Compare commits

36 Commits

Author SHA1 Message Date
Markus Frindt
86a14daf79 Update say task and add possibility to use elevenlabs options (#556)
* Update say task and add possibility to use elevenlabs options from synthesizer

* revert ms change

* fix condition for alerting

---------

Co-authored-by: Markus Frindt <m.frindt@cognigy.com>
2023-12-01 09:51:22 -05:00
Hoan Luu Huu
c66ad39001 sending DTMF inband from updateCall/command (#536)
* sending DTMF inband from updateCall/command

* wip

* wip
2023-12-01 09:05:14 -05:00
Hoan Luu Huu
0a0cbd57ba support elevenlabs options (#553)
* support elevenlabs options

* elevenlabs options from synthesizer

* wip

* fix
2023-11-30 09:28:12 -05:00
Hoan Luu Huu
eb2d90ffaa fix release freeswitch media properly (#550)
* fix release freeswitch media properly

* if a leg is opus, modify b leg offer opus first

* if a leg is opus, modify b leg offer opus first

* wip

* wip

* fix review comments

* fix review comments

* fix review comments
2023-11-29 10:17:15 -05:00
Anton Voylenko
454ff7d1b8 update envs (#551) 2023-11-28 14:54:39 -05:00
Markus Frindt
7e349fe4e5 Add a custom sanitization function for the "Tag" property (#546)
* Add a custom sanitization function for the tag property in create call body

* remove tag.* property from schema

* do not escape URLs but validate them

* apply suggestions from PR review

---------

Co-authored-by: Markus Frindt <m.frindt@cognigy.com>
2023-11-27 13:03:47 -05:00
Hoan Luu Huu
9478f3a1b8 fix disable gather timeout if interDigitTimeout enabled. (#538) 2023-11-21 19:09:09 -06:00
Hoan Luu Huu
a3c241b569 update drachtio-fsmrf new version for fixing amd and transcribe run parallel (#543) 2023-11-21 11:12:45 -06:00
Anton Voylenko
5a68563f96 live call control for tag (#539) 2023-11-17 08:48:17 -05:00
Hoan Luu Huu
1cdd0cf611 feat Adulting call session should have its own requestor (#535)
* feat Adulting call session should have its own requestor

* fix eslint

* fix eslint

* if user authenticates from http server instead of database

* wip

* fix custom STT

* fix custom STT

* fix custom STT
2023-11-16 08:22:35 -05:00
Dave Horton
9ae4b04fc5 update badge 2023-11-14 10:27:16 -05:00
Anton Voylenko
170c3c7ec4 rest:dial: Support sipindialog hook (#531)
* support sipindialog hook

* rename sipindialog method
2023-11-14 09:16:08 -05:00
Dave Horton
7c36a08852 Feat/dial via sip proxy (#532)
* add support for dialing via proxy with target.proxy

* update verb-specifications with support for target.proxy
2023-11-14 09:06:43 -05:00
Hoan Luu Huu
633237da1b let realtimedb-helpers build configuration from env vars (#526)
* let realtimedb-helpers build configuration from env vars

* update speech-utils version
2023-11-14 08:57:50 -05:00
Dave Horton
708c2c661f 0.8.5 2023-11-09 12:36:04 -05:00
Hoan Luu Huu
87632c549e feat support Whisper TTS (#523)
* feat support openai

* update speech utils version
2023-11-09 09:51:20 -05:00
Hoan Luu Huu
31559cbb3b user restriction (#520) 2023-11-08 12:39:56 -05:00
Dave Horton
1156bae2de fix for #521 - allow pause in confirmHook applications (#522) 2023-11-07 09:46:41 -05:00
Dave Horton
c6c599ab99 fix dialogflow tts bug (tts not working due to 'default' being assigned to label) and update to drachtio-srf with fix for parsing sip:1234@feature-server (#518) 2023-11-06 15:05:17 -05:00
Dave Horton
4d0f0fe75f prevent exception referencing user.uri (#517) 2023-11-06 13:42:37 -05:00
Dave Horton
6d625d87ad Feat/assemblyai testing (#516)
* handle errors from assemblyai

* wip

* fix alert

* normalizeAssemblyai
2023-11-02 17:05:28 -04:00
Hoan Luu Huu
7fee2ba2dc feat assembly (#515)
* fix

* wip

* wip

* wip

* wip

* fix review comments
2023-11-02 09:25:04 -04:00
Hoan Luu Huu
4b3234f4e4 feat disable direct p2p call by env variable (#514)
* feat disable direct p2p call by env variable

* wip

* wip
2023-11-01 09:59:01 -04:00
Dave Horton
6b9f6a7d90 if dial.confirmHook returns empty array do not create a confirmSession (#513) 2023-11-01 08:47:41 -04:00
Dave Horton
3cdf568fb6 fix logging on child leg after REFER received on A leg (#512) 2023-10-31 20:55:39 -04:00
Hoan Luu Huu
e73bef4af0 google custom voice (#506)
* google custom voice

* fixed

* wip

* wip

* wip

* wip
2023-10-30 20:10:30 -04:00
Anton Voylenko
42d1069617 allow tag verb for waithook (#510) 2023-10-30 18:41:49 -04:00
Dave Horton
e5772d6b85 allow dial referHook to return application to execute on the other le… (#505)
* allow dial referHook to return application to execute on the other leg; fixes #504

* fix session tracking

* minor logging

* minor
2023-10-30 13:59:54 -04:00
Dave Horton
f43a5c1491 deepgram: rework continuous asr, and resolve on speech_final not is_f… (#501)
* deepgram: rework continuous asr, and resolve on speech_final not is_final (wip)

* wip

* deepgram: empty final transcript should trigger resolve with speech if we have buffered transcripts

* wip

* fixes for deepgram compiling multiple transcripts

* test deepgram utteranceEndMs

* more handling of utteranceEndMs

* wip

* better handling of digit strings collected over multiple deepgram responses

* wip

* add support for deepgramOptions.shortUtterance which triggers off of is_final instead of speech_final

* apply deepgram fixes to transcribe

* cleanup continuous asr

* more continuous asr fixes for deepgram

* update to verb-specifications for handling SttTask properties

* set log level for tests back to error
2023-10-30 13:57:25 -04:00
Dave Horton
67f8f7181a #508 - add support for azureOptions.speechSegmentationSilenceTimeoutMs (#509)
* #508 - add support for azureOptions.speechSegmentationSilenceTimeoutMs

* update verb specs
2023-10-30 13:10:31 -04:00
Dave Horton
ddab4d7548 update to verb-specifications for handling SttTask properties 2023-10-28 14:20:26 -04:00
Dave Horton
916d988dbd add support for deepgram smart_format option (#500)
* add support for deepgram smart_format option

* handle nonexistent hints
2023-10-25 14:29:38 -04:00
Hoan Luu Huu
d6b74c3da8 Fix/dynamic apps (#499)
* refactor utilization jambonz app for queue and user

* refactor utilization jambonz app for queue and user

* refactor utilization jambonz app for queue and user

* refactor utilization jambonz app for queue and user

* refactor utilization jambonz app for queue and user
2023-10-24 20:16:01 -04:00
Hoan Luu Huu
3171b138f9 Feat/support device call (#495)
* support dequeue from registered user

* support dequeue from registered user

* wip

* wip

* wip

* wip

* fix review comments
2023-10-24 08:54:59 -04:00
Dave Horton
168b4dc051 fix #493: remove nulls from createCall payload in order not to trip up validation (#494) 2023-10-23 10:58:28 -04:00
Hoan Luu Huu
cf0f4d405f fix device call prioritize call to registered client first (#492) 2023-10-23 07:51:16 -04:00
29 changed files with 3404 additions and 515 deletions

@@ -1,4 +1,4 @@
# jambonz-feature-server ![Build Status](https://github.com/jambonz/jambonz-feature-server/workflows/CI/badge.svg)
# jambonz-feature-server [![CI](https://github.com/jambonz/jambonz-feature-server/actions/workflows/build.yml/badge.svg)](https://github.com/jambonz/jambonz-feature-server/actions/workflows/build.yml)
This application implements the core feature server of the jambones platform.
@@ -40,6 +40,8 @@ Configuration is provided via environment variables:
|JAMBONZ_RECORD_WS_BASE_URL| recording websocket URL to send the recording audio|no|
|JAMBONZ_RECORD_WS_USERNAME| recording websocket username|no|
|JAMBONZ_RECORD_WS_PASSWORD| recording websocket password|no|
|ANCHOR_MEDIA_ALWAYS| keep media on media server|no|
|JAMBONZ_DISABLE_DIAL_PAI_HEADER| control P-Asserted-Identity header on B-Leg|no|
### running under pm2
Typically, this application runs under [pm2](https://pm2.io) using an [ecosystem.config.js](https://pm2.keymetrics.io/docs/usage/application-declaration/) file similar to this:
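The ecosystem.config.js example itself is not reproduced in this compare view; a minimal sketch, assuming a typical pm2 layout and placeholder values for the environment variables documented above, might look like:

module.exports = {
  apps: [{
    name: 'jambonz-feature-server',
    script: 'app.js',
    instance_var: 'INSTANCE_ID',
    exec_mode: 'fork',
    instances: 1,
    env: {
      NODE_ENV: 'production',
      // placeholder values; see the configuration table above
      JAMBONES_MYSQL_HOST: '127.0.0.1',
      JAMBONES_MYSQL_USER: 'jambones',
      JAMBONES_MYSQL_PASSWORD: 'jambones',
      JAMBONES_MYSQL_DATABASE: 'jambones',
      JAMBONES_REDIS_HOST: '127.0.0.1',
      JAMBONES_REDIS_PORT: 6379,
      JAMBONZ_RECORD_WS_BASE_URL: 'wss://recording.example.com'
    }
  }]
};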

bin/k8s-pre-stop-hook.js Executable file → Normal file

@@ -28,10 +28,6 @@ const JAMBONES_MYSQL_PORT = parseInt(process.env.JAMBONES_MYSQL_PORT, 10) || 330
const JAMBONES_MYSQL_REFRESH_TTL = parseInt(process.env.JAMBONES_MYSQL_REFRESH_TTL, 10) || 0;
const JAMBONES_MYSQL_CONNECTION_LIMIT = parseInt(process.env.JAMBONES_MYSQL_CONNECTION_LIMIT, 10) || 10;
/* redis */
const JAMBONES_REDIS_HOST = process.env.JAMBONES_REDIS_HOST;
const JAMBONES_REDIS_PORT = parseInt(process.env.JAMBONES_REDIS_PORT, 10) || 6379;
/* gather and hints */
const JAMBONES_GATHER_EARLY_HINTS_MATCH = process.env.JAMBONES_GATHER_EARLY_HINTS_MATCH;
const JAMBONZ_GATHER_EARLY_HINTS_MATCH = process.env.JAMBONZ_GATHER_EARLY_HINTS_MATCH;
@@ -127,30 +123,11 @@ const HTTP_PROXY_PROTOCOL = process.env.JAMBONES_HTTP_PROXY_PROTOCOL || 'http';
const OPTIONS_PING_INTERVAL = parseInt(process.env.OPTIONS_PING_INTERVAL, 10) || 30000;
const JAMBONES_REDIS_SENTINELS = process.env.JAMBONES_REDIS_SENTINELS ? {
sentinels: process.env.JAMBONES_REDIS_SENTINELS.split(',').map((sentinel) => {
let host, port = 26379;
if (sentinel.includes(':')) {
const arr = sentinel.split(':');
host = arr[0];
port = parseInt(arr[1], 10);
} else {
host = sentinel;
}
return {host, port};
}),
name: process.env.JAMBONES_REDIS_SENTINEL_MASTER_NAME,
...(process.env.JAMBONES_REDIS_SENTINEL_PASSWORD && {
password: process.env.JAMBONES_REDIS_SENTINEL_PASSWORD
}),
...(process.env.JAMBONES_REDIS_SENTINEL_USERNAME && {
username: process.env.JAMBONES_REDIS_SENTINEL_USERNAME
})
} : null;
const JAMBONZ_RECORD_WS_BASE_URL = process.env.JAMBONZ_RECORD_WS_BASE_URL || process.env.JAMBONES_RECORD_WS_BASE_URL;
const JAMBONZ_RECORD_WS_USERNAME = process.env.JAMBONZ_RECORD_WS_USERNAME || process.env.JAMBONES_RECORD_WS_USERNAME;
const JAMBONZ_RECORD_WS_PASSWORD = process.env.JAMBONZ_RECORD_WS_PASSWORD || process.env.JAMBONES_RECORD_WS_PASSWORD;
const JAMBONZ_DISABLE_DIAL_PAI_HEADER = process.env.JAMBONZ_DISABLE_DIAL_PAI_HEADER || false;
const JAMBONES_DISABLE_DIRECT_P2P_CALL = process.env.JAMBONES_DISABLE_DIRECT_P2P_CALL || false;
module.exports = {
JAMBONES_MYSQL_HOST,
@@ -169,9 +146,6 @@ module.exports = {
JAMBONZ_GATHER_EARLY_HINTS_MATCH,
JAMBONES_GATHER_CLEAR_GLOBAL_HINTS_ON_EMPTY_HINTS,
JAMBONES_FREESWITCH,
JAMBONES_REDIS_HOST,
JAMBONES_REDIS_PORT,
JAMBONES_REDIS_SENTINELS,
SMPP_URL,
JAMBONES_NETWORK_CIDR,
JAMBONES_API_BASE_URL,
@@ -233,5 +207,6 @@ module.exports = {
JAMBONZ_RECORD_WS_BASE_URL,
JAMBONZ_RECORD_WS_USERNAME,
JAMBONZ_RECORD_WS_PASSWORD,
JAMBONZ_DISABLE_DIAL_PAI_HEADER
JAMBONZ_DISABLE_DIAL_PAI_HEADER,
JAMBONES_DISABLE_DIRECT_P2P_CALL
};
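For reference, the JAMBONES_REDIS_SENTINELS string parsed above is a comma-separated list of host[:port] entries, with the port defaulting to 26379; with placeholder hostnames:

// JAMBONES_REDIS_SENTINELS='sentinel-1:26379,sentinel-2'
// JAMBONES_REDIS_SENTINEL_MASTER_NAME='mymaster'
// produces:
{
  sentinels: [
    {host: 'sentinel-1', port: 26379},
    {host: 'sentinel-2', port: 26379}
  ],
  name: 'mymaster'
}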

lib/dynamic-apps.js Normal file

@@ -0,0 +1,54 @@
const appsMap = {
queue: {
// Dummy hook to follow later feature server logic.
call_hook: {
url: 'https://jambonz.org',
method: 'GET'
},
account_sid: '',
app_json: [{
verb: 'dequeue',
name: '',
timeout: 5
}]
},
user: {
// Dummy hook to follow later feature server logic.
call_hook: {
url: 'https://jambonz.org',
method: 'GET'
},
account_sid: '',
app_json: [{
verb: 'dial',
callerId: '',
answerOnBridge: true,
target: [
{
type: 'user',
name: ''
}
]
}]
}
};
const createJambonzApp = (type, {account_sid, name, caller_id}) => {
const app = {...appsMap[type]};
app.account_sid = account_sid;
switch (type) {
case 'queue':
app.app_json[0].name = name;
break;
case 'user':
app.app_json[0].callerId = caller_id;
app.app_json[0].target[0].name = name;
break;
}
app.app_json = JSON.stringify(app.app_json);
return app;
};
module.exports = {
createJambonzApp
};
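A usage sketch of the helper above, with placeholder sids and names; createJambonzApp returns an application object whose app_json is a stringified array of verbs:

const {createJambonzApp} = require('./lib/dynamic-apps');

// generate a dial application for a call placed directly to a registered user
const app = createJambonzApp('user', {
  account_sid: 'ACxxxxxxxx',
  name: 'alice@customer.sip.example.com',
  caller_id: '15083084809'
});
// app.app_json === '[{"verb":"dial","callerId":"15083084809","answerOnBridge":true,
//                     "target":[{"type":"user","name":"alice@customer.sip.example.com"}]}]'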

@@ -5,7 +5,7 @@ const CallInfo = require('../../session/call-info');
const {CallDirection, CallStatus} = require('../../utils/constants');
const uuidv4 = require('uuid-random');
const SipError = require('drachtio-srf').SipError;
const { validationResult } = require('express-validator');
const { validationResult, body } = require('express-validator');
const { validate } = require('@jambonz/verb-specifications');
const sysError = require('./error');
const HttpRequestor = require('../../utils/http-requestor');
@@ -13,16 +13,30 @@ const WsRequestor = require('../../utils/ws-requestor');
const RootSpan = require('../../utils/call-tracer');
const dbUtils = require('../../utils/db-utils');
const { mergeSdpMedia, extractSdpMedia } = require('../../utils/sdp-utils');
const { createCallSchema } = require('../schemas/create-call');
const { createCallSchema, customSanitizeFunction } = require('../schemas/create-call');
const removeNullProperties = (obj) => (Object.keys(obj).forEach((key) => obj[key] === null && delete obj[key]), obj);
const removeNulls = (req, res, next) => {
req.body = removeNullProperties(req.body);
next();
};
router.post('/',
removeNulls,
createCallSchema,
body('tag').custom((value) => {
if (value) {
customSanitizeFunction(value);
}
return true;
}),
async(req, res) => {
const {logger} = req.app.locals;
const errors = validationResult(req);
if (!errors.isEmpty()) {
logger.info({errors: errors.array()}, 'POST /Calls: validation errors');
return res.status(400).json({ errors: errors.array() });
}
const {logger} = req.app.locals;
const accountSid = req.body.account_sid;
const {srf} = require('../../..');

@@ -45,7 +45,7 @@ function retrieveCallSession(callSid, opts) {
router.post('/:callSid', async(req, res) => {
const logger = req.app.locals.logger;
const callSid = req.params.callSid;
logger.debug({body: req.body}, 'got upateCall request');
logger.debug({body: req.body}, 'got updateCall request');
try {
const cs = retrieveCallSession(callSid, req.body);
if (!cs) {

@@ -46,11 +46,6 @@ const createCallSchema = checkSchema({
optional: true,
errorMessage: 'Invalid tag',
},
'tag.*': {
trim: true,
escape: true,
stripLow: true,
},
app_json: {
isString: true,
optional: true,
@@ -109,6 +104,34 @@ const createCallSchema = checkSchema({
}
}, ['body']);
module.exports = {
createCallSchema
const customSanitizeFunction = (value) => {
try {
if (Array.isArray(value)) {
value = value.map((item) => customSanitizeFunction(item));
} else if (typeof value === 'object') {
Object.keys(value).forEach((key) => {
value[key] = customSanitizeFunction(value[key]);
});
} else if (typeof value === 'string') {
/* trims characters at the beginning and at the end of a string */
value = value.trim();
/* We don't escape URLs but verify them via new URL */
if (value.includes('http')) {
value = new URL(value).toString();
} else {
/* replaces <, >, &, ', " and / with their corresponding HTML entities */
value = escape(value);
}
}
} catch (error) {
value = `Error: ${error.message}`;
}
return value;
};
module.exports = {
createCallSchema,
customSanitizeFunction
};
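An illustrative sketch of the sanitizer above applied to a tag payload (values are placeholders, require path assumed): strings are trimmed, anything containing 'http' must parse via new URL() and is returned un-escaped, other strings are escaped, and arrays and nested objects are walked recursively.

const {customSanitizeFunction} = require('./lib/http/routes/schemas/create-call');  // path assumed

const tag = {
  note: '  <b>vip</b> customer  ',        // trimmed, then escaped
  site: 'https://example.com/path?x=1',   // contains 'http', so validated via new URL()
  ids: ['42', 'broken http://[']          // the second entry fails URL parsing
};
const clean = customSanitizeFunction(tag);
// clean.note   -> trimmed and escaped string
// clean.site   -> 'https://example.com/path?x=1'
// clean.ids[1] -> 'Error: Invalid URL' (parse errors are replaced with the error message)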

@@ -11,8 +11,10 @@ const dbUtils = require('./utils/db-utils');
const RootSpan = require('./utils/call-tracer');
const listTaskNames = require('./utils/summarize-tasks');
const {
JAMBONES_MYSQL_REFRESH_TTL
JAMBONES_MYSQL_REFRESH_TTL,
JAMBONES_DISABLE_DIRECT_P2P_CALL
} = require('./config');
const { createJambonzApp } = require('./dynamic-apps');
module.exports = function(srf, logger) {
const {
@@ -21,17 +23,20 @@ module.exports = function(srf, logger) {
lookupAppBySid,
lookupAppByRealm,
lookupAppByTeamsTenant,
registrar
registrar,
lookupClientByAccountAndUsername
} = srf.locals.dbHelpers;
const {
writeAlerts,
AlertType
} = srf.locals;
const {lookupAccountDetails} = dbUtils(logger, srf);
const {lookupAccountDetails, lookupGoogleCustomVoice} = dbUtils(logger, srf);
function initLocals(req, res, next) {
async function initLocals(req, res, next) {
const callId = req.get('Call-ID');
const uri = parseUri(req.uri);
logger.info({
uri,
callId,
callingNumber: req.callingNumber,
calledNumber: req.calledNumber
@@ -43,12 +48,46 @@ module.exports = function(srf, logger) {
const callSid = req.has('X-Retain-Call-Sid') ? req.get('X-Retain-Call-Sid') : uuidv4();
const account_sid = req.get('X-Account-Sid');
req.locals = {callSid, account_sid, callId};
if (req.has('X-Application-Sid')) {
let clientDb = null;
if (req.has('X-Authenticated-User')) {
req.locals.originatingUser = req.get('X-Authenticated-User');
const arr = /^(.*)@(.*)/.exec(req.locals.originatingUser);
if (arr) {
[clientDb] = await lookupClientByAccountAndUsername(account_sid, arr[1]);
}
}
// check for call to application
if (uri.user?.startsWith('app-') && req.locals.originatingUser && clientDb?.allow_direct_app_calling) {
const application_sid = uri.user.match(/app-(.*)/)[1];
logger.debug(`got application from Request URI header: ${application_sid}`);
req.locals.application_sid = application_sid;
} else if (req.has('X-Application-Sid')) {
const application_sid = req.get('X-Application-Sid');
logger.debug(`got application from X-Application-Sid header: ${application_sid}`);
req.locals.application_sid = application_sid;
}
if (req.has('X-Authenticated-User')) req.locals.originatingUser = req.get('X-Authenticated-User');
// check for call to queue
if (uri.user?.startsWith('queue-') && req.locals.originatingUser && clientDb?.allow_direct_queue_calling) {
const queue_name = uri.user.match(/queue-(.*)/)[1];
logger.debug(`got Queue from Request URI header: ${queue_name}`);
req.locals.queue_name = queue_name;
}
// check for call to registered user
if (!JAMBONES_DISABLE_DIRECT_P2P_CALL && req.locals.originatingUser && clientDb?.allow_direct_user_calling) {
const arr = /^(.*)@(.*)/.exec(req.locals.originatingUser);
if (arr) {
const sipRealm = arr[2];
const called_user = `${req.calledNumber}@${sipRealm}`;
const reg = await registrar.query(called_user);
if (reg) {
logger.debug(`got called Number is a registered user: ${called_user}`);
req.locals.called_user = called_user;
}
}
}
if (req.has('X-MS-Teams-Tenant-FQDN')) req.locals.msTeamsTenant = req.get('X-MS-Teams-Tenant-FQDN');
if (req.has('X-Cisco-Recording-Participant')) {
const ciscoParticipants = req.get('X-Cisco-Recording-Participant');
@@ -185,8 +224,16 @@ module.exports = function(srf, logger) {
const {span} = rootSpan.startChildSpan('lookupApplication');
try {
let app;
if (req.locals.application_sid) app = await lookupAppBySid(req.locals.application_sid);
else if (req.locals.originatingUser) {
if (req.locals.queue_name) {
logger.debug(`calling to queue ${req.locals.queue_name}, generating queue app`);
app = createJambonzApp('queue', {account_sid, name: req.locals.queue_name});
} else if (req.locals.called_user) {
logger.debug(`calling to registered user ${req.locals.called_user}, generating dial app`);
app = createJambonzApp('user',
{account_sid, name: req.locals.called_user, caller_id: req.locals.callingNumber});
} else if (req.locals.application_sid) {
app = await lookupAppBySid(req.locals.application_sid);
} else if (req.locals.originatingUser) {
const arr = /^(.*)@(.*)/.exec(req.locals.originatingUser);
if (arr) {
const sipRealm = arr[2];
@@ -194,34 +241,6 @@ module.exports = function(srf, logger) {
app = await lookupAppByRealm(sipRealm);
if (app) {
logger.debug({app}, `retrieved device calling app for realm ${sipRealm}`);
} else {
const calledAor = `${req.calledNumber}@${sipRealm}`;
const reg = await registrar.query(calledAor);
if (reg) {
logger.debug(`There is no device app found, called client is registered,
forwarding call to called client.`);
app = {
// Dummy hook to follow later feature server logic.
call_hook: {
url: 'https://jambonz.org',
method: 'GET'
},
account_sid,
app_json: JSON.stringify(
[{
verb: 'dial',
callerId: arr[1],
answerOnBridge: true,
target: [
{
type: 'user',
name: calledAor
}
]
}]
)
};
}
}
}
}
@@ -290,6 +309,21 @@ module.exports = function(srf, logger) {
else app2.notifier = {request: () => {}, close: () => {}};
}
// Resolve application.speech_synthesis_voice if it's custom voice
if (app2.speech_synthesis_vendor === 'google' && app2.speech_synthesis_voice.startsWith('custom_')) {
const arr = /custom_(.*)/.exec(app2.speech_synthesis_voice);
if (arr) {
const google_custom_voice_sid = arr[1];
const [custom_voice] = await lookupGoogleCustomVoice(google_custom_voice_sid);
if (custom_voice) {
app2.speech_synthesis_voice = {
reportedUsage: custom_voice.reported_usage,
model: custom_voice.model
};
}
}
}
req.locals.application = app2;
// eslint-disable-next-line no-unused-vars
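In other words, when an application's google voice has the form custom_<google_custom_voice_sid>, the sid is extracted, the google_custom_voices row is looked up, and the voice string is replaced by an object; a sketch with placeholder values:

// stored on the application (placeholder sid):
//   speech_synthesis_vendor: 'google'
//   speech_synthesis_voice:  'custom_8a6f3b2c-1d2e-4f56-9abc-0123456789ab'
// after the lookup above, app2.speech_synthesis_voice becomes:
{
  reportedUsage: 'REALTIME',    // google_custom_voices.reported_usage (placeholder value)
  model: 'projects/my-project/locations/us-central1/models/my-custom-voice'  // placeholder
}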

@@ -19,6 +19,7 @@ class AdultingCallSession extends CallSession {
rootSpan
});
this.sd = singleDialer;
this.req = callInfo.req;
this.sd.dlg.on('destroy', () => {
this.logger.info('AdultingCallSession: called party hung up');

@@ -831,6 +831,17 @@ class CallSession extends Emitter {
cobalt_server_uri: credential.cobalt_server_uri
};
} else if ('elevenlabs' === vendor) {
return {
api_key: credential.api_key,
model_id: credential.model_id,
options: credential.options
};
} else if ('assemblyai' === vendor) {
return {
speech_credential_sid: credential.speech_credential_sid,
api_key: credential.api_key
};
} else if ('whisper' === vendor) {
return {
api_key: credential.api_key,
model_id: credential.model_id
@@ -1111,6 +1122,21 @@ class CallSession extends Emitter {
transcribeTask.updateTranscribe(opts.transcribe_status);
}
/**
* perform live call control -- update customer data
* @param {object} opts
* @param {object} opts.tag - customer data
*/
_lccTag(opts) {
const {tag} = opts;
if (typeof tag !== 'object' || Array.isArray(tag) || tag === null) {
this.logger.info('CallSession:_lccTag - invalid tag data');
return;
}
this.logger.debug({customerData: tag}, 'CallSession:_lccTag set customer data in callInfo');
this.callInfo.customerData = tag;
}
async _lccMuteStatus(callSid, mute) {
// this whole thing requires us to be in a Dial or Conference verb
const task = this.currentTask;
@@ -1161,6 +1187,38 @@ class CallSession extends Emitter {
}
}
/**
* perform live call control - send RFC 2833 DTMF
* @param {obj} opts
* @param {string} opts.dtmf.digit - DTMF digit
* @param {string} opts.dtmf.duration - Optional, Duration
*/
async _lccDtmf(opts, callSid) {
const {dtmf} = opts;
const {digit, duration = 250} = dtmf;
if (!this.hasStableDialog) {
this.logger.info('CallSession:_lccDtmf - invalid command as we do not have a stable call');
return;
}
try {
const dlg = callSid === this.callSid ? this.dlg : this.currentTask.dlg;
const res = await dlg.request({
method: 'INFO',
headers: {
'Content-Type': 'application/dtmf',
'X-Reason': 'Dtmf'
},
body: `Signal=${digit}
Duration=${duration} `
});
this.logger.debug({res}, `CallSession:_lccDtmf
got response to INFO DTMF digit=${digit} and duration=${duration}`);
return res;
} catch (err) {
this.logger.error({err}, 'CallSession:_lccDtmf - error sending INFO RFC 2833 DTMF');
}
}
/**
* perform live call control -- whisper to one party or the other on a call
* @param {array} opts - array of play or say tasks
@@ -1215,7 +1273,7 @@ class CallSession extends Emitter {
/**
* perform live call control
* @param {object} opts - update instructions
* @param {string} callSid - identifies call toupdate
* @param {string} callSid - identifies call to update
*/
async updateCall(opts, callSid) {
this.logger.debug(opts, 'CallSession:updateCall');
@@ -1244,13 +1302,18 @@ class CallSession extends Emitter {
else if (opts.sip_request) {
const res = await this._lccSipRequest(opts, callSid);
return {status: res.status, reason: res.reason};
} else if (opts.dtmf) {
await this._lccDtmf(opts, callSid);
}
else if (opts.record) {
await this.notifyRecordOptions(opts.record);
}
else if (opts.tag) {
return this._lccTag(opts);
}
// whisper may be the only thing we are asked to do, or it may that
// we are doing a whisper after having muted, paused reccording etc..
// we are doing a whisper after having muted, paused recording etc..
if (opts.whisper) {
return this._lccWhisper(opts, callSid);
}
@@ -1423,6 +1486,13 @@ class CallSession extends Emitter {
});
break;
case 'dtmf':
this._lccDtmf(data, call_sid)
.catch((err) => {
this.logger.info({err, data}, `CallSession:_onCommand - error sending RFC 2833 DTMF ${data}`);
});
break;
default:
this.logger.info(`CallSession:_onCommand - invalid command ${command}`);
}
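The new dtmf and tag branches extend the live call control (updateCall) payload; a hedged sketch of a request body exercising both, with placeholder values:

// POST /v1/Accounts/{account_sid}/Calls/{call_sid}
{
  "dtmf": {
    "digit": "5",        // sent in-dialog as SIP INFO with Content-Type application/dtmf
    "duration": 250      // optional, defaults to 250
  },
  "tag": {
    "customer_id": "abc-123",   // must be a plain object; stored as callInfo.customerData
    "vip": true
  }
}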

@@ -17,7 +17,7 @@ const dbUtils = require('../utils/db-utils');
const debug = require('debug')('jambonz:feature-server');
const {parseUri} = require('drachtio-srf');
const {ANCHOR_MEDIA_ALWAYS, JAMBONZ_DISABLE_DIAL_PAI_HEADER} = require('../config');
const { isOnhold } = require('../utils/sdp-utils');
const { isOnhold, isOpusFirst } = require('../utils/sdp-utils');
const { normalizeJambones } = require('@jambonz/verb-specifications');
function parseDtmfOptions(logger, dtmfCapture) {
@@ -322,7 +322,7 @@ class TaskDial extends Task {
const to = parseUri(req.getParsedHeader('Refer-To').uri);
const by = parseUri(req.getParsedHeader('Referred-By').uri);
this.logger.info({to}, 'refer to parsed');
await cs.requestor.request('verb:hook', this.referHook, {
const json = await cs.requestor.request('verb:hook', this.referHook, {
...callInfo,
refer_details: {
sip_refer_to: req.get('Refer-To'),
@@ -334,6 +334,29 @@ class TaskDial extends Task {
referred_call_sid
}
}, httpHeaders);
if (json && Array.isArray(json)) {
try {
const logger = isChild ? this.logger : this.sd.logger;
const tasks = normalizeJambones(logger, json).map((tdata) => makeTask(this.logger, tdata));
if (tasks && tasks.length > 0) {
const legs = isChild ? ['child', 'parent'] : ['parent', 'child'];
logger.info(`Dial:handleRefer received REFER on ${legs[0]} leg, setting new app on ${legs[1]} leg`);
if (isChild) this.redirect(cs, tasks);
else {
logger.info({tasks: json}, 'Dial:handleRefer - new application for for child leg');
const adultingSession = await this.sd.doAdulting({
logger,
application: cs.application,
tasks
});
/* need to update the callSid of the child with its own (new) AdultingCallSession */
sessionTracker.add(adultingSession.callSid, adultingSession);
}
}
} catch (err) {
this.logger.info(err, 'Dial:handleRefer - error setting new application after receiving REFER');
}
}
res.send(202);
this.logger.info('DialTask:handleRefer - sent 202 Accepted');
} catch (err) {
@@ -465,7 +488,8 @@ class TaskDial extends Task {
headers: this.headers,
proxy: `sip:${sbcAddress}`,
callingNumber: this.callerId || req.callingNumber,
...(this.callerName && {callingName: this.callerName})
...(this.callerName && {callingName: this.callerName}),
opusFirst: isOpusFirst(this.cs.ep.remote.sdp)
};
const t = this.target.find((t) => t.type === 'teams');
@@ -767,9 +791,11 @@ class TaskDial extends Task {
assert(cs.ep && sd.ep);
try {
// Wait until we got new SDP from B leg to offer to A Leg
const aLegSdp = cs.ep.remote.sdp;
await sd.releaseMediaToSBC(aLegSdp, cs.ep.local.sdp);
const bLegSdp = sd.dlg.remote.sdp;
await Promise.all[sd.releaseMediaToSBC(aLegSdp, cs.ep.local.sdp), cs.releaseMediaToSBC(bLegSdp)];
await cs.releaseMediaToSBC(bLegSdp);
this.epOther = null;
this.logger.info('Dial:_releaseMedia - successfully released media from freewitch');
} catch (err) {
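With this change a dial referHook may answer with a jambonz application (an array of verbs) to run on the other leg once the REFER is accepted; a sketch of such a webhook response, with placeholder text:

[
  {"verb": "say", "text": "The other party has transferred you; please hold."},
  {"verb": "hangup"}
]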

@@ -58,13 +58,13 @@ class Dialogflow extends Task {
this.vendor = this.data.tts.vendor || 'default';
this.language = this.data.tts.language || 'default';
this.voice = this.data.tts.voice || 'default';
this.speechSynthesisLabel = this.data.tts.label || 'default';
this.speechSynthesisLabel = this.data.tts.label;
// fallback tts
this.fallbackVendor = this.data.tts.fallbackVendor || 'default';
this.fallbackLanguage = this.data.tts.fallbackLanguage || 'default';
this.fallbackVoice = this.data.tts.fallbackLanguage || 'default';
this.fallbackLabel = this.data.tts.fallbackLabel || 'default';
this.fallbackLabel = this.data.tts.fallbackLabel;
}
this.bargein = this.data.bargein;
}

@@ -311,7 +311,8 @@ class TaskEnqueue extends Task {
}
}
async _playHook(cs, dlg, hook, allowed = [TaskName.Play, TaskName.Say, TaskName.Pause, TaskName.Leave]) {
async _playHook(cs, dlg, hook,
allowed = [TaskName.Play, TaskName.Say, TaskName.Pause, TaskName.Leave, TaskName.Tag]) {
const {sortedSetLength, sortedSetPositionByPattern} = cs.srf.locals.dbHelpers;
const b3 = this.getTracingPropagation();
const httpHeaders = b3 && {b3};
@@ -331,6 +332,7 @@ class TaskEnqueue extends Task {
queuePosition: queuePosition.length ? queuePosition[0] : 0,
callSid: this.cs.callSid,
callId: this.cs.callId,
customerData: this.cs.callInfo.customerData
});
} catch (err) {
this.logger.error({err}, `TaskEnqueue:_playHook error retrieving list info for queue ${this.queueName}`);
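Because the tag verb is now allowed in the queue wait hook, and the hook payload now includes customerData, a waitHook response for a queued caller might look like this (values are placeholders):

[
  {"verb": "tag", "data": {"department": "support", "priority": "gold"}},
  {"verb": "say", "text": "You are still in the queue, please continue to hold."},
  {"verb": "pause", "length": 10}
]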

@@ -9,8 +9,9 @@ const {
CobaltTranscriptionEvents,
IbmTranscriptionEvents,
NvidiaTranscriptionEvents,
JambonzTranscriptionEvents
} = require('../utils/constants');
JambonzTranscriptionEvents,
AssemblyAiTranscriptionEvents
} = require('../utils/constants.json');
const {
JAMBONES_GATHER_EARLY_HINTS_MATCH,
JAMBONZ_GATHER_EARLY_HINTS_MATCH,
@@ -20,16 +21,6 @@ const makeTask = require('./make_task');
const assert = require('assert');
const SttTask = require('./stt-task');
const compileTranscripts = (logger, evt, arr) => {
if (!Array.isArray(arr) || arr.length === 0) return;
let t = '';
for (const a of arr) {
t += ` ${a.alternatives[0].transcript}`;
}
t += ` ${evt.alternatives[0].transcript}`;
evt.alternatives[0].transcript = t.trim();
};
class TaskGather extends SttTask {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
@@ -51,8 +42,10 @@ class TaskGather extends SttTask {
/* continuous ASR (i.e. compile transcripts until a special timeout or dtmf key) */
this.asrTimeout = typeof this.data.recognizer.asrTimeout === 'number' ?
this.data.recognizer.asrTimeout * 1000 : 0;
if (this.asrTimeout > 0) this.asrDtmfTerminationDigit = this.data.recognizer.asrDtmfTerminationDigit;
this.isContinuousAsr = this.asrTimeout > 0;
if (this.asrTimeout > 0) {
this.isContinuousAsr = true;
this.asrDtmfTerminationDigit = this.data.recognizer.asrDtmfTerminationDigit;
}
if (Array.isArray(this.data.recognizer.hints) &&
0 == this.data.recognizer.hints.length && JAMBONES_GATHER_CLEAR_GLOBAL_HINTS_ON_EMPTY_HINTS) {
@@ -351,6 +344,13 @@ class TaskGather extends SttTask {
async _setSpeechHandlers(cs, ep) {
if (this._speechHandlersSet) return;
this._speechHandlersSet = true;
/* some special deepgram logic */
if (this.vendor === 'deepgram') {
if (this.isContinuousAsr) this._doContinuousAsrWithDeepgram(this.asrTimeout);
if (this.data.recognizer?.deepgramOptions?.shortUtterance) this.shortUtterance = true;
}
const opts = this.setChannelVarsForStt(this, this.sttCredentials, this.data.recognizer);
switch (this.vendor) {
case 'google':
@@ -393,9 +393,12 @@ class TaskGather extends SttTask {
case 'deepgram':
this.bugname = 'deepgram_transcribe';
ep.addCustomEventListener(DeepgramTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
ep.addCustomEventListener(DeepgramTranscriptionEvents.Connect, this._onDeepgramConnect.bind(this, cs, ep));
ep.addCustomEventListener(DeepgramTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(DeepgramTranscriptionEvents.ConnectFailure,
this._onDeepGramConnectFailure.bind(this, cs, ep));
this._onVendorConnectFailure.bind(this, cs, ep));
/* if app sets deepgramOptions.utteranceEndMs they essentially want continuous asr */
if (opts.DEEPGRAM_SPEECH_UTTERANCE_END_MS) this.isContinuousAsr = true;
break;
case 'soniox':
@@ -436,9 +439,9 @@ class TaskGather extends SttTask {
case 'ibm':
this.bugname = 'ibm_transcribe';
ep.addCustomEventListener(IbmTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
ep.addCustomEventListener(IbmTranscriptionEvents.Connect, this._onIbmConnect.bind(this, cs, ep));
ep.addCustomEventListener(IbmTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(IbmTranscriptionEvents.ConnectFailure,
this._onIbmConnectFailure.bind(this, cs, ep));
this._onVendorConnectFailure.bind(this, cs, ep));
break;
case 'nvidia':
@@ -458,13 +461,22 @@ class TaskGather extends SttTask {
}
break;
case 'assemblyai':
this.bugname = 'assemblyai_transcribe';
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
break;
default:
if (this.vendor.startsWith('custom:')) {
this.bugname = `${this.vendor}_transcribe`;
ep.addCustomEventListener(JambonzTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
ep.addCustomEventListener(JambonzTranscriptionEvents.Connect, this._onJambonzConnect.bind(this, cs, ep));
ep.addCustomEventListener(JambonzTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(JambonzTranscriptionEvents.ConnectFailure,
this._onJambonzConnectFailure.bind(this, cs, ep));
this._onVendorConnectFailure.bind(this, cs, ep));
break;
}
else {
@@ -487,6 +499,12 @@ class TaskGather extends SttTask {
interim: this.interim,
bugname: this.bugname
}, 'Gather:_startTranscribing');
/**
* Note: we don't need to ask deepgram for interim results, because they
* already send us words as they are finalized (is_final=true) even before
* the utterance is finalized (speech_final=true)
*/
ep.startTranscription({
vendor: this.vendor,
locale: this.language,
@@ -510,7 +528,9 @@ class TaskGather extends SttTask {
this._clearTimer();
this._timeoutTimer = setTimeout(() => {
if (this.isContinuousAsr) this._startAsrTimer();
else this._resolve(this.digitBuffer.length >= this.minDigits ? 'dtmf-num-digits' : 'timeout');
else if (this.interDigitTimeout <= 0 || this.digitBuffer.length < this.minDigits || this.needsStt) {
this._resolve(this.digitBuffer.length >= this.minDigits ? 'dtmf-num-digits' : 'timeout');
}
}, this.timeout);
}
@@ -522,11 +542,13 @@ class TaskGather extends SttTask {
}
_startAsrTimer() {
if (this.vendor === 'deepgram') return; // no need
assert(this.isContinuousAsr);
this._clearAsrTimer();
this._asrTimer = setTimeout(() => {
this.logger.debug('_startAsrTimer - asr timer went off');
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout');
const evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout', evt);
}, this.asrTimeout);
this.logger.debug(`_startAsrTimer: set for ${this.asrTimeout}ms`);
}
@@ -556,7 +578,8 @@ class TaskGather extends SttTask {
this._clearFinalAsrTimer();
this._finalAsrTimer = setTimeout(() => {
this.logger.debug('_startFinalAsrTimer - final asr timer went off');
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout');
const evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout', evt);
}, 1000);
this.logger.debug('_startFinalAsrTimer: set for 1 second');
}
@@ -595,11 +618,23 @@ class TaskGather extends SttTask {
this.logger.debug({evt, bugname, finished}, `Gather:_onTranscription for vendor ${this.vendor}`);
if (bugname && this.bugname !== bugname) return;
if (this.vendor === 'ibm') {
if (evt?.state === 'listening') return;
if (this.vendor === 'ibm' && evt?.state === 'listening') return;
if (this.vendor === 'deepgram' && evt.type === 'UtteranceEnd') {
/* we will only get this when we have set utterance_end_ms */
if (this._bufferedTranscripts.length === 0) {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram but no buffered transcripts');
}
else {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram, return buffered transcript');
evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
this._bufferedTranscripts = [];
this._resolve('speech', evt);
}
return;
}
evt = this.normalizeTranscription(evt, this.vendor, 1, this.language);
evt = this.normalizeTranscription(evt, this.vendor, 1, this.language, this.shortUtterance);
if (evt.alternatives.length === 0) {
this.logger.info({evt}, 'TaskGather:_onTranscription - got empty transcript, continue listening');
return;
@@ -621,15 +656,27 @@ class TaskGather extends SttTask {
const bufferedWords = this._sonioxTranscripts.length +
this._bufferedTranscripts.reduce((count, e) => count + e.alternatives[0]?.transcript.split(' ').length, 0);
let emptyTranscript = false;
if (evt.is_final) {
if (evt.alternatives[0].transcript === '' && !this.callSession.callGone && !this.killed) {
emptyTranscript = true;
if (finished === 'true' && ['microsoft', 'deepgram'].includes(this.vendor)) {
this.logger.debug({evt}, 'TaskGather:_onTranscription - got empty transcript from old gather, disregarding');
return;
}
else {
else if (this.vendor !== 'deepgram') {
this.logger.info({evt}, 'TaskGather:_onTranscription - got empty transcript, continue listening');
return;
}
else if (this.isContinuousAsr) {
this.logger.info({evt},
'TaskGather:_onTranscription - got empty deepgram transcript during continous asr, continue listening');
return;
}
else if (this.vendor === 'deepgram' && this._bufferedTranscripts.length > 0) {
this.logger.info({evt},
'TaskGather:_onTranscription - got empty transcript from deepgram, return the buffered transcripts');
}
return;
}
if (this.isContinuousAsr) {
@@ -641,14 +688,14 @@ class TaskGather extends SttTask {
this.logger.debug('TaskGather:_onTranscription - removing trailing punctuation');
evt.alternatives[0].transcript = t.slice(0, -1);
}
else this.logger.debug({t}, 'TaskGather:_onTranscription - no trailing punctuation');
}
this.logger.info({evt}, 'TaskGather:_onTranscription - got transcript during continous asr');
this._bufferedTranscripts.push(evt);
this._clearTimer();
if (this._finalAsrTimer) {
this._clearFinalAsrTimer();
return this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout');
const evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
return this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout', evt);
}
this._startAsrTimer();
@@ -670,16 +717,25 @@ class TaskGather extends SttTask {
evt = this.compileSonioxTranscripts(this._sonioxTranscripts, 1, this.language);
this._sonioxTranscripts = [];
}
else if (this.vendor === 'deepgram') {
/* compile transcripts into one */
if (!emptyTranscript) this._bufferedTranscripts.push(evt);
if (this.data.recognizer?.deepgramOptions?.utteranceEndMs) {
this.logger.debug('TaskGather:_onTranscription - got speech_final waiting for UtteranceEnd event');
return;
}
this.logger.debug({evt}, 'TaskGather:_onTranscription - compiling deepgram transcripts');
evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
this._bufferedTranscripts = [];
this.logger.debug({evt}, 'TaskGather:_onTranscription - compiled deepgram transcripts');
}
/* here is where we return a final transcript */
this._resolve('speech', evt);
}
}
}
else {
/* google has a measure of stability:
https://cloud.google.com/speech-to-text/docs/basics#streaming_responses
others do not.
*/
//const isStableEnough = typeof evt.stability === 'undefined' || evt.stability > GATHER_STABILITY_THRESHOLD;
this._clearTimer();
this._startTimer();
if (this.bargein && (words + bufferedWords) >= this.minBargeinWordCount) {
@@ -705,6 +761,14 @@ class TaskGather extends SttTask {
this._sonioxTranscripts.push(evt.vendor.finalWords);
}
}
/* deepgram can send a non-final transcript but with words that are final, so we need to buffer */
if (this.vendor === 'deepgram') {
const originalEvent = evt.vendor.evt;
if (originalEvent.is_final && evt.alternatives[0].transcript !== '') {
this.logger.debug({evt}, 'Gather:_onTranscription - buffering a completed (partial) deepgram transcript');
this._bufferedTranscripts.push(evt);
}
}
}
}
_onEndOfUtterance(cs, ep) {
@@ -719,7 +783,7 @@ class TaskGather extends SttTask {
* getting a transcription. This can happen if someone coughs or mumbles.
* For that reason don't ask for a single utterance and we'll terminate the transcribe operation
* once we get a final transcript.
* However, if the usr has specified a singleUtterance, then we need to restart here
* However, if the user has specified a singleUtterance, then we need to restart here
* since we dont have a final transcript yet.
*/
if (!this.resolved && !this.killed && !this._bufferedTranscripts.length && this.wantsSingleUtterance) {
@@ -736,12 +800,6 @@ class TaskGather extends SttTask {
_onTranscriptionComplete(cs, ep) {
this.logger.debug('TaskGather:_onTranscriptionComplete');
}
_onDeepgramConnect(_cs, _ep) {
this.logger.debug('TaskGather:_onDeepgramConnect');
}
_onJambonzConnect(_cs, _ep) {
this.logger.debug('TaskGather:_onJambonzConnect');
}
async _onJambonzError(cs, ep, evt) {
this.logger.info({evt}, 'TaskGather:_onJambonzError');
if (this.isHandledByPrimaryProvider && this.fallbackVendor) {
@@ -775,54 +833,16 @@ class TaskGather extends SttTask {
this.notifyError({msg: 'ASR error', details:`Custom speech vendor ${this.vendor} error: ${evt.error}`});
}
_onDeepGramConnectFailure(cs, _ep, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskGather:_onDeepgramConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to Deepgram speech recognizer: ${reason}`,
vendor: 'deepgram',
}).catch((err) => this.logger.info({err}, 'Error generating alert for deepgram connection failure'));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor deepgram: ${reason}`});
this.notifyTaskDone();
}
_onJambonzConnectFailure(cs, _ep, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskGather:_onJambonzConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to ${this.vendor} speech recognizer: ${reason}`,
vendor: this.vendor,
}).catch((err) => this.logger.info({err}, 'Error generating alert for jambonz custom connection failure'));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor ${this.vendor}: ${reason}`});
_onVendorConnectFailure(cs, _ep, evt) {
super._onVendorConnectFailure(cs, _ep, evt);
this.notifyTaskDone();
}
_onIbmConnect(_cs, _ep) {
this.logger.debug('TaskGather:_onIbmConnect');
_onVendorError(cs, _ep, evt) {
super._onVendorError(cs, _ep, evt);
this._resolve('stt-error', evt);
}
_onIbmConnectFailure(cs, _ep, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskGather:_onIbmConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to IBM watson speech recognizer: ${reason}`,
vendor: 'ibm',
}).catch((err) => this.logger.info({err}, 'Error generating alert for IBM connection failure'));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor IBM: ${reason}`});
this.notifyTaskDone();
}
_onIbmError(cs, _ep, evt) {
this.logger.info({evt}, 'TaskGather:_onIbmError'); }
_onVadDetected(cs, ep) {
if (this.bargein && this.minBargeinWordCount === 0) {
this.logger.debug('TaskGather:_onVadDetected');
@@ -858,18 +878,6 @@ class TaskGather extends SttTask {
this._clearTimer();
this._clearFastRecognitionTimer();
if (this.isContinuousAsr && reason.startsWith('speech')) {
evt = {
is_final: true,
transcripts: this._bufferedTranscripts
};
this.logger.debug({evt}, 'TaskGather:resolve continuous asr');
}
else if (!this.isContinuousAsr && reason.startsWith('speech') && this._bufferedTranscripts.length) {
compileTranscripts(this.logger, evt, this._bufferedTranscripts);
this.logger.debug({evt}, 'TaskGather:resolve buffered results');
}
this.span.setAttributes({
channel: 1,
'stt.resolve': reason,
@@ -908,6 +916,13 @@ class TaskGather extends SttTask {
await this.performAction({reason: 'timeout'});
}
}
else if (reason.startsWith('stt-error')) {
if (this.parentTask) this.parentTask.emit('stt-error', evt);
else {
this.emit('stt-error', evt);
await this.performAction({reason: 'error', details: evt.error});
}
}
} catch (err) { /*already logged error*/ }
this.notifyTaskDone();
}
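A sketch of a gather verb exercising the reworked deepgram handling; asrTimeout turns on continuous ASR (for deepgram it is mapped to utterance_end_ms), and the new deepgramOptions control the resolve behavior (values are placeholders):

{
  "verb": "gather",
  "input": ["speech"],
  "actionHook": "/transcript",
  "recognizer": {
    "vendor": "deepgram",
    "language": "en-US",
    "asrTimeout": 2,                  // continuous ASR: buffer transcripts until silence
    "asrDtmfTerminationDigit": "#",
    "deepgramOptions": {
      "smart_format": true
    }
  }
}
// alternatively, deepgramOptions.shortUtterance resolves on is_final rather than speech_final,
// and setting deepgramOptions.utteranceEndMs directly also enables continuous ASR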

@@ -16,6 +16,7 @@ class TaskRestDial extends Task {
this.to = this.data.to;
this.call_hook = this.data.call_hook;
this.timeout = this.data.timeout || 60;
this.sipRequestWithinDialogHook = this.data.sipRequestWithinDialogHook;
this.on('connect', this._onConnect.bind(this));
this.on('callStatus', this._onCallStatus.bind(this));
@@ -64,7 +65,7 @@ class TaskRestDial extends Task {
const cs = this.callSession;
cs.setDialog(dlg);
this.logger.debug('TaskRestDial:_onConnect - call connected');
if (this.sipRequestWithinDialogHook) this._initSipRequestWithinDialogHandler(cs, dlg);
try {
const b3 = this.getTracingPropagation();
const httpHeaders = b3 && {b3};
@@ -142,6 +143,16 @@ class TaskRestDial extends Task {
this.logger.error({err}, 'Rest:dial:_onAmdEvent - error calling actionHook');
});
}
_initSipRequestWithinDialogHandler(cs, dlg) {
cs.sipRequestWithinDialogHook = this.sipRequestWithinDialogHook;
dlg.on('info', this._onRequestWithinDialog.bind(this, cs));
dlg.on('message', this._onRequestWithinDialog.bind(this, cs));
}
async _onRequestWithinDialog(cs, req, res) {
cs._onRequestWithinDialog(req, res);
}
}
module.exports = TaskRestDial;
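The new hook is read from the REST createCall payload; a hedged sketch of such a request, with placeholder numbers and URLs:

// POST /v1/Accounts/{account_sid}/Calls
{
  "from": "15083084809",
  "to": {"type": "phone", "number": "15083084810"},
  "call_hook": "https://example.com/call-hook",
  "sipRequestWithinDialogHook": "https://example.com/in-dialog"  // invoked for INFO/MESSAGE within the dialog
}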

@@ -86,6 +86,13 @@ class TaskSay extends Task {
credentials.api_key = this.options.apiKey || credentials.apiKey;
credentials.region = this.options.region || credentials.region;
voice = this.options.voice || voice;
} else if (vendor === 'elevenlabs') {
credentials = credentials || {};
credentials.model_id = this.options.model_id || credentials.model_id;
credentials.voice_settings = this.options.voice_settings || {};
credentials.optimize_streaming_latency = this.options.optimize_streaming_latency
|| credentials.optimize_streaming_latency;
voice = this.options.voice_id || voice;
}
this.logger.info({vendor, language, voice, model}, 'TaskSay:exec');
@@ -127,6 +134,7 @@ class TaskSay extends Task {
model,
salt,
credentials,
options: this.options,
disableTtsCache : this.disableTtsCache
});
this.logger.debug(`file ${filePath}, served from cache ${servedFromCache}`);
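A sketch of a say verb passing elevenlabs options through the synthesizer, assuming the options object read above is supplied as synthesizer.options (ids and values are placeholders):

{
  "verb": "say",
  "text": "Hello, and thanks for calling.",
  "synthesizer": {
    "vendor": "elevenlabs",
    "language": "en-US",
    "voice": "pNInz6obpgDQGcFmaJgB",      // elevenlabs voice_id (placeholder)
    "options": {
      "model_id": "eleven_turbo_v2",      // placeholder model
      "optimize_streaming_latency": 2,
      "voice_settings": {"stability": 0.5, "similarity_boost": 0.75}
    }
  }
}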

@@ -16,12 +16,14 @@ class SttTask extends Task {
normalizeTranscription,
removeSpeechListeners,
setSpeechCredentialsAtRuntime,
compileSonioxTranscripts
compileSonioxTranscripts,
consolidateTranscripts
} = require('../utils/transcription-utils')(logger);
this.setChannelVarsForStt = setChannelVarsForStt;
this.normalizeTranscription = normalizeTranscription;
this.removeSpeechListeners = removeSpeechListeners;
this.compileSonioxTranscripts = compileSonioxTranscripts;
this.consolidateTranscripts = consolidateTranscripts;
this.isHandledByPrimaryProvider = true;
if (this.data.recognizer) {
@@ -138,6 +140,44 @@ class SttTask extends Task {
addKey(key, evt.compiled_context, 3600 * 12)
.catch((err) => this.logger.info({err}, `Error caching cobalt context for ${key}`));
}
_doContinuousAsrWithDeepgram(asrTimeout) {
/* deepgram has an utterance_end_ms property that simplifies things */
assert(this.vendor === 'deepgram');
this.logger.debug(`_doContinuousAsrWithDeepgram - setting utterance_end_ms to ${asrTimeout}`);
const dgOptions = this.data.recognizer.deepgramOptions = this.data.recognizer.deepgramOptions || {};
dgOptions.utteranceEndMs = dgOptions.utteranceEndMs || asrTimeout;
}
_onVendorConnect(_cs, _ep) {
this.logger.debug(`TaskGather:_on${this.vendor}Connect`);
}
_onVendorError(cs, _ep, evt) {
this.logger.info({evt}, `${this.name}:_on${this.vendor}Error`);
const {writeAlerts, AlertType} = cs.srf.locals;
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: 'STT failure reported by vendor',
detail: evt.error,
vendor: this.vendor,
}).catch((err) => this.logger.info({err}, `Error generating alert for ${this.vendor} connection failure`));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor ${this.vendor}: ${evt.error}`});
}
_onVendorConnectFailure(cs, _ep, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, `${this.name}:_on${this.vendor}ConnectFailure`);
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to ${this.vendor} speech recognizer: ${reason}`,
vendor: this.vendor,
}).catch((err) => this.logger.info({err}, `Error generating alert for ${this.vendor} connection failure`));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor ${this.vendor}: ${reason}`});
}
}
module.exports = SttTask;

@@ -12,7 +12,7 @@ class TaskTag extends Task {
async exec(cs) {
super.exec(cs);
cs.callInfo.customerData = this.data;
//this.logger.debug({callInfo: cs.callInfo.toJSON()}, 'TaskTag:exec set customer data in callInfo');
this.logger.debug({customerData: cs.callInfo.customerData}, 'TaskTag:exec set customer data in callInfo');
}
}

@@ -11,8 +11,9 @@ const {
IbmTranscriptionEvents,
NvidiaTranscriptionEvents,
JambonzTranscriptionEvents,
TranscribeStatus
} = require('../utils/constants');
TranscribeStatus,
AssemblyAiTranscriptionEvents
} = require('../utils/constants.json');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const SttTask = require('./stt-task');
@@ -34,7 +35,9 @@ class TaskTranscribe extends SttTask {
// Continuous asr timeout
this.asrTimeout = typeof this.data.recognizer.asrTimeout === 'number' ? this.data.recognizer.asrTimeout * 1000 : 0;
this.isContinuousAsr = this.asrTimeout > 0;
if (this.asrTimeout > 0) {
this.isContinuousAsr = true;
}
/* buffer speech for continuous asr */
this._bufferedTranscripts = [];
}
@@ -177,6 +180,12 @@ class TaskTranscribe extends SttTask {
async _setSpeechHandlers(cs, ep, channel) {
if (this[`_speechHandlersSet_${channel}`]) return;
this[`_speechHandlersSet_${channel}`] = true;
/* some special deepgram logic */
if (this.vendor === 'deepgram') {
if (this.isContinuousAsr) this._doContinuousAsrWithDeepgram(this.asrTimeout);
}
const opts = this.setChannelVarsForStt(this, this.sttCredentials, this.data.recognizer);
switch (this.vendor) {
case 'google':
@@ -210,19 +219,19 @@ class TaskTranscribe extends SttTask {
this.bugname = 'nuance_transcribe';
ep.addCustomEventListener(NuanceTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
ep.addCustomEventListener(NuanceTranscriptionEvents.StartOfSpeech,
this._onStartOfSpeech.bind(this, cs, ep, channel));
ep.addCustomEventListener(NuanceTranscriptionEvents.TranscriptionComplete,
this._onTranscriptionComplete.bind(this, cs, ep, channel));
break;
case 'deepgram':
this.bugname = 'deepgram_transcribe';
ep.addCustomEventListener(DeepgramTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
ep.addCustomEventListener(DeepgramTranscriptionEvents.Connect,
this._onDeepgramConnect.bind(this, cs, ep, channel));
this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(DeepgramTranscriptionEvents.ConnectFailure,
this._onDeepGramConnectFailure.bind(this, cs, ep, channel));
this._onVendorConnectFailure.bind(this, cs, ep, channel));
/* if app sets deepgramOptions.utteranceEndMs they essentially want continuous asr */
if (opts.DEEPGRAM_SPEECH_UTTERANCE_END_MS) this.isContinuousAsr = true;
break;
case 'soniox':
this.bugname = 'soniox_transcribe';
@@ -264,21 +273,25 @@ class TaskTranscribe extends SttTask {
ep.addCustomEventListener(IbmTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
ep.addCustomEventListener(IbmTranscriptionEvents.Connect,
this._onIbmConnect.bind(this, cs, ep, channel));
this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(IbmTranscriptionEvents.ConnectFailure,
this._onIbmConnectFailure.bind(this, cs, ep, channel));
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
case 'nvidia':
this.bugname = 'nvidia_transcribe';
ep.addCustomEventListener(NvidiaTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
ep.addCustomEventListener(NvidiaTranscriptionEvents.StartOfSpeech,
this._onStartOfSpeech.bind(this, cs, ep));
ep.addCustomEventListener(NvidiaTranscriptionEvents.TranscriptionComplete,
this._onTranscriptionComplete.bind(this, cs, ep));
ep.addCustomEventListener(NvidiaTranscriptionEvents.VadDetected,
this._onVadDetected.bind(this, cs, ep));
break;
case 'assemblyai':
this.bugname = 'assemblyai_transcribe';
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
default:
@@ -286,9 +299,9 @@ class TaskTranscribe extends SttTask {
this.bugname = `${this.vendor}_transcribe`;
ep.addCustomEventListener(JambonzTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
ep.addCustomEventListener(JambonzTranscriptionEvents.Connect, this._onJambonzConnect.bind(this, cs, ep));
ep.addCustomEventListener(JambonzTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
ep.addCustomEventListener(JambonzTranscriptionEvents.ConnectFailure,
this._onJambonzConnectFailure.bind(this, cs, ep));
this._onVendorConnectFailure.bind(this, cs, ep));
break;
}
else {
@@ -329,8 +342,20 @@ class TaskTranscribe extends SttTask {
const bugname = fsEvent.getHeader('media-bugname');
if (bugname && this.bugname !== bugname) return;
if (this.vendor === 'ibm') {
if (evt?.state === 'listening') return;
if (this.vendor === 'ibm' && evt?.state === 'listening') return;
if (this.vendor === 'deepgram' && evt.type === 'UtteranceEnd') {
/* we will only get this when we have set utterance_end_ms */
if (this._bufferedTranscripts.length === 0) {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram but no buffered transcripts');
}
else {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram, return buffered transcript');
evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language);
this._bufferedTranscripts = [];
this._resolve('speech', evt);
}
return;
}
this.logger.debug({evt}, 'TaskTranscribe:_onTranscription - before normalization');
@@ -369,17 +394,6 @@ class TaskTranscribe extends SttTask {
}
}
_compileTranscripts() {
assert(this._bufferedTranscripts.length);
const evt = this._bufferedTranscripts[0];
let t = '';
for (const a of this._bufferedTranscripts) {
t += ` ${a.alternatives[0].transcript}`;
}
evt.alternatives[0].transcript = t.trim();
return evt;
}
async _resolve(channel, evt) {
/* we've got a transcript, so end the otel child span for this channel */
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
@@ -467,78 +481,7 @@ class TaskTranscribe extends SttTask {
this._timer = null;
}
}
_onDeepgramConnect(_cs, _ep) {
this.logger.debug('TaskTranscribe:_onDeepgramConnect');
}
_onDeepGramConnectFailure(cs, _ep, channel, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskTranscribe:_onDeepgramConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to Deepgram speech recognizer: ${reason}`,
vendor: 'deepgram',
}).catch((err) => this.logger.info({err}, 'Error generating alert for deepgram connection failure'));
this.notifyError(`Failed connecting to speech vendor deepgram: ${reason}`);
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'connection failure'
});
this.childSpan[channel - 1].span.end();
}
this.notifyTaskDone();
}
_onJambonzConnect(_cs, _ep) {
this.logger.debug('TaskTranscribe:_onJambonzConnect');
}
_onJambonzConnectFailure(cs, _ep, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskTranscribe:_onJambonzConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to ${this.vendor} speech recognizer: ${reason}`,
vendor: this.vendor,
}).catch((err) => this.logger.info({err}, 'Error generating alert for jambonz custom connection failure'));
this.notifyError({msg: 'ASR error', details:`Failed connecting to speech vendor ${this.vendor}: ${reason}`});
this.notifyTaskDone();
}
_onIbmConnect(_cs, _ep) {
this.logger.debug('TaskTranscribe:_onIbmConnect');
}
_onIbmConnectFailure(cs, _ep, channel, evt) {
const {reason} = evt;
const {writeAlerts, AlertType} = cs.srf.locals;
this.logger.info({evt}, 'TaskTranscribe:_onIbmConnectFailure');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to IBM watson speech recognizer: ${reason}`,
vendor: 'ibm',
}).catch((err) => this.logger.info({err}, 'Error generating alert for IBM connection failure'));
this.notifyError(`Failed connecting to speech vendor IBM: ${reason}`);
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'connection failure'
});
this.childSpan[channel - 1].span.end();
}
this.notifyTaskDone();
}
_onIbmError(cs, _ep, _channel, evt) {
this.logger.info({evt}, 'TaskTranscribe:_onIbmError');
}
async _onJambonzError(cs, _ep, evt) {
this.logger.info({evt}, 'TaskTranscribe:_onJambonzError');
if (this.isHandledByPrimaryProvider && this.fallbackVendor) {
@@ -576,12 +519,25 @@ class TaskTranscribe extends SttTask {
}
}
_onVendorConnectFailure(cs, _ep, channel, evt) {
super._onVendorConnectFailure(cs, _ep, evt);
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'connection failure'
});
this.childSpan[channel - 1].span.end();
}
this.notifyTaskDone();
}
_startAsrTimer(channel) {
if (this.vendor === 'deepgram') return; // no need
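// (with utterance_end_ms set, Deepgram's UtteranceEnd event flushes the buffered transcripts instead)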
assert(this.isContinuousAsr);
this._clearAsrTimer(channel);
this._asrTimer = setTimeout(() => {
this.logger.debug(`TaskTranscribe:_startAsrTimer - asr timer went off for channel: ${channel}`);
const evt = this._compileTranscripts();
const evt = this.consolidateTranscripts(this._bufferedTranscripts, channel, this.language);
this._bufferedTranscripts = [];
this._resolve(channel, evt);
}, this.asrTimeout);


@@ -126,6 +126,12 @@
"Connect": "jambonz_transcribe::connect",
"Error": "jambonz_transcribe::error"
},
"AssemblyAiTranscriptionEvents": {
"Transcription": "assemblyai_transcribe::transcription",
"Error": "assemblyai_transcribe::error",
"ConnectFailure": "assemblyai_transcribe::connect_failed",
"Connect": "assemblyai_transcribe::connect"
},
"ListenEvents": {
"Connect": "mod_audio_fork::connect",
"ConnectFailure": "mod_audio_fork::connect_failed",

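The new AssemblyAiTranscriptionEvents names are what the transcribe/gather tasks subscribe to on the endpoint; a minimal sketch, assuming the lib/utils/constants.json path and the drachtio-fsmrf custom-event API used elsewhere in this repo (handler names are illustrative):

const {AssemblyAiTranscriptionEvents} = require('./lib/utils/constants.json');
// wire an endpoint to AssemblyAI transcription events, mirroring how the other vendors are hooked up
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Transcription, (evt) => onTranscription(cs, ep, evt));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.Connect, () => onVendorConnect(cs, ep));
ep.addCustomEventListener(AssemblyAiTranscriptionEvents.ConnectFailure, (evt) => onVendorConnectFailure(cs, ep, evt));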

@@ -27,6 +27,9 @@ WHERE pn.account_sid IS NULL
AND pn.service_provider_sid =
(SELECT service_provider_sid from accounts where account_sid = ?)
AND pn.number = ?`;
const sqlQueryGoogleCustomVoices = `SELECT *
FROM google_custom_voices
WHERE google_custom_voice_sid = ?`;
const speechMapper = (cred) => {
const {credential, ...obj} = cred;
@@ -88,6 +91,14 @@ const speechMapper = (cred) => {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.options = o.options;
} else if ('assemblyai' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
} else if ('whisper' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
} else if (obj.vendor.startsWith('custom:')) {
const o = JSON.parse(decrypt(credential));
obj.auth_token = o.auth_token;
@@ -161,10 +172,22 @@ module.exports = (logger, srf) => {
}
};
const lookupGoogleCustomVoice = async(google_custom_voice_sid) => {
const pp = pool.promise();
try {
const [r] = await pp.query(sqlQueryGoogleCustomVoices, [google_custom_voice_sid]);
return r;
} catch (err) {
logger.error({err}, `lookupGoogleCustomVoices: Error ${google_custom_voice_sid}`);
}
};
return {
lookupAccountDetails,
updateSpeechCredentialLastUsed,
lookupCarrier,
lookupCarrierByPhoneNumber
lookupCarrierByPhoneNumber,
lookupGoogleCustomVoice
};
};
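A hedged usage sketch of the new lookup (the module path and sid value are illustrative; on a query error the helper logs and returns undefined):

const {lookupGoogleCustomVoice} = require('./lookup-helpers')(logger, srf);  // module path is an assumption
const rows = await lookupGoogleCustomVoice('b8a2c3d4-0000-0000-0000-000000000000');
const customVoice = rows?.length ? rows[0] : null;  // full row from google_custom_voices, or null if not found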


@@ -8,9 +8,6 @@ const {
JAMBONES_MYSQL_CONNECTION_LIMIT,
JAMBONES_MYSQL_PORT,
JAMBONES_FREESWITCH,
JAMBONES_REDIS_HOST,
JAMBONES_REDIS_PORT,
JAMBONES_REDIS_SENTINELS,
SMPP_URL,
JAMBONES_TIME_SERIES_HOST,
JAMBONES_ESL_LISTEN_ADDRESS,
@@ -140,7 +137,8 @@ function installSrfLocals(srf, logger) {
lookupTeamsByAccount,
lookupAccountBySid,
lookupAccountCapacitiesBySid,
lookupSmppGateways
lookupSmppGateways,
lookupClientByAccountAndUsername
} = require('@jambonz/db-helpers')({
host: JAMBONES_MYSQL_HOST,
user: JAMBONES_MYSQL_USER,
@@ -174,19 +172,13 @@ function installSrfLocals(srf, logger) {
retrieveByPatternSortedSet,
sortedSetLength,
sortedSetPositionByPattern
} = require('@jambonz/realtimedb-helpers')(JAMBONES_REDIS_SENTINELS || {
host: JAMBONES_REDIS_HOST,
port: JAMBONES_REDIS_PORT || 6379
}, logger, tracer);
} = require('@jambonz/realtimedb-helpers')({}, logger, tracer);
const registrar = new Registrar(logger, client);
const {
synthAudio,
getNuanceAccessToken,
getIbmAccessToken,
} = require('@jambonz/speech-utils')(JAMBONES_REDIS_SENTINELS || {
host: JAMBONES_REDIS_HOST,
port: JAMBONES_REDIS_PORT || 6379
}, logger, tracer);
} = require('@jambonz/speech-utils')({redis_client: client}, logger);
const {
writeAlerts,
AlertType
@@ -217,6 +209,7 @@ function installSrfLocals(srf, logger) {
lookupAccountBySid,
lookupAccountCapacitiesBySid,
lookupSmppGateways,
lookupClientByAccountAndUsername,
updateCallStatus,
retrieveCall,
listCalls,

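With this change @jambonz/realtimedb-helpers builds its Redis configuration from environment variables itself, and @jambonz/speech-utils reuses the same redis client instead of opening its own connection. A minimal sketch of the resulting setup (env names are the ones removed from the destructuring above; values are illustrative and would normally be exported in the service environment rather than set inline):

process.env.JAMBONES_REDIS_HOST = '127.0.0.1';
process.env.JAMBONES_REDIS_PORT = '6379';
// an empty options object now suffices; the helper reads the environment itself
const {client} = require('@jambonz/realtimedb-helpers')({}, logger, tracer);
const {synthAudio} = require('@jambonz/speech-utils')({redis_client: client}, logger);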

@@ -13,6 +13,9 @@ const moment = require('moment');
const stripCodecs = require('./strip-ancillary-codecs');
const RootSpan = require('./call-tracer');
const uuidv4 = require('uuid-random');
const HttpRequestor = require('./http-requestor');
const WsRequestor = require('./ws-requestor');
const {makeOpusFirst} = require('./sdp-utils');
class SingleDialer extends Emitter {
constructor({logger, sbcAddress, target, opts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask}) {
@@ -76,7 +79,8 @@ class SingleDialer extends Emitter {
...(this.from.host && {'X-Preferred-From-Host': this.from.host}),
'X-Jambonz-Routing': this.target.type,
'X-Call-Sid': this.callSid,
...(this.applicationSid && {'X-Application-Sid': this.applicationSid})
...(this.applicationSid && {'X-Application-Sid': this.applicationSid}),
...(this.target.proxy && {'X-SIP-Proxy': this.target.proxy})
};
if (srf.locals.fsUUID) {
opts.headers = {
@@ -148,7 +152,7 @@ class SingleDialer extends Emitter {
Object.assign(opts, {
proxy: `sip:${this.sbcAddress}`,
localSdp: this.ep.local.sdp
localSdp: opts.opusFirst ? makeOpusFirst(this.ep.local.sdp) : this.ep.local.sdp
});
if (this.target.auth) opts.auth = this.target.auth;
inviteSpan = this.startSpan('invite', {
@@ -176,6 +180,7 @@ class SingleDialer extends Emitter {
* (a) create a logger for this call
*/
req.srf = srf;
this.req = req;
this.callInfo = new CallInfo({
direction: CallDirection.Outbound,
parentCallInfo: this.parentCallInfo,
@@ -326,10 +331,15 @@ class SingleDialer extends Emitter {
try {
// retrieve set of tasks
const json = await this.requestor.request('dial:confirm', confirmHook, this.callInfo.toJSON());
if (!json || (Array.isArray(json) && json.length === 0)) {
this.logger.info('SingleDialer:_executeApp: no tasks returned from confirm hook');
return;
}
const tasks = normalizeJambones(this.logger, json).map((tdata) => makeTask(this.logger, tdata));
// verify it contains only allowed verbs
const allowedTasks = tasks.filter((task) => {
return [
TaskPreconditions.None,
TaskPreconditions.StableCall,
TaskPreconditions.Endpoint
].includes(task.preconditions);
@@ -377,15 +387,35 @@ class SingleDialer extends Emitter {
this.dlg.linkedSpanId = this.rootSpan.traceId;
const rootSpan = new RootSpan('outbound-call', this.dlg);
const newLogger = logger.child({traceId: rootSpan.traceId});
//clone the application from the parent call, giving it its own requestor;
//the parent application's requestor will be closed if the parent call hangs up
const app = {...application};
if ('WS' === app.call_hook?.method ||
app.call_hook?.url.startsWith('ws://') || app.call_hook?.url.startsWith('wss://')) {
const requestor = new WsRequestor(logger, this.accountInfo.account.account_sid,
app.call_hook, this.accountInfo.account.webhook_secret);
app.requestor = requestor;
app.notifier = requestor;
app.call_hook.method = 'WS';
}
else {
app.requestor = new HttpRequestor(logger, this.accountInfo.account.account_sid,
app.call_hook, this.accountInfo.account.webhook_secret);
if (app.call_status_hook) app.notifier = new HttpRequestor(logger,
this.accountInfo.account.account_sid, app.call_status_hook,
this.accountInfo.account.webhook_secret);
else app.notifier = {request: () => {}, close: () => {}};
}
const cs = new AdultingCallSession({
logger: newLogger,
singleDialer: this,
application,
application: app,
callInfo: this.callInfo,
accountInfo: this.accountInfo,
tasks,
rootSpan
});
cs.req = this.req;
cs.exec().catch((err) => newLogger.error({err}, 'doAdulting: error executing session'));
return cs;
}
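The X-SIP-Proxy header added above is driven by an optional proxy property on the dial target (the "dial via sip proxy" change); a hedged sketch of the application-side verb, with made-up numbers and proxy URI:

const dialVerb = {
  verb: 'dial',
  callerId: '+15551230001',
  target: [{
    type: 'phone',
    number: '+15551230002',
    proxy: 'sip:proxy.example.com'   // passed through to the SBC as X-SIP-Proxy
  }]
};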


@@ -12,6 +12,30 @@ const mergeSdpMedia = (sdp1, sdp2) => {
return sdpTransform.write(parsedSdp1);
};
const getCodecPlacement = (parsedSdp, codec) => parsedSdp?.media[0]?.rtp?.findIndex((e) => e.codec === codec);
const isOpusFirst = (sdp) => {
return getCodecPlacement(sdpTransform.parse(sdp), 'opus') === 0;
};
const makeOpusFirst = (sdp) => {
const parsedSdp = sdpTransform.parse(sdp);
// Find the index of the OPUS codec
const opusIndex = getCodecPlacement(parsedSdp, 'opus');
// Move OPUS codec to the beginning
if (opusIndex > 0) {
const opusEntry = parsedSdp.media[0].rtp.splice(opusIndex, 1)[0];
parsedSdp.media[0].rtp.unshift(opusEntry);
// Also move the corresponding payload type in the "m" line
const opusPayloadType = parsedSdp.media[0].payloads.split(' ')[opusIndex];
const otherPayloadTypes = parsedSdp.media[0].payloads.split(' ').filter((pt) => pt != opusPayloadType);
parsedSdp.media[0].payloads = [opusPayloadType, ...otherPayloadTypes].join(' ');
}
return sdpTransform.write(parsedSdp);
};
const extractSdpMedia = (sdp) => {
const parsedSdp1 = sdpTransform.parse(sdp);
if (parsedSdp1.media.length > 1) {
@@ -28,5 +52,7 @@ const extractSdpMedia = (sdp) => {
module.exports = {
isOnhold,
mergeSdpMedia,
extractSdpMedia
extractSdpMedia,
isOpusFirst,
makeOpusFirst
};
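A quick sketch of the two new helpers using an abbreviated, made-up SDP (the offers in the test file below are full WebRTC SDPs):

const {isOpusFirst, makeOpusFirst} = require('./lib/utils/sdp-utils');
const sdp = [
  'v=0', 'o=- 1 1 IN IP4 127.0.0.1', 's=-', 'c=IN IP4 127.0.0.1', 't=0 0',
  'm=audio 4000 RTP/AVP 0 111',
  'a=rtpmap:0 PCMU/8000',
  'a=rtpmap:111 opus/48000/2'
].join('\r\n') + '\r\n';
isOpusFirst(sdp);    // false: PCMU (payload 0) is offered first
makeOpusFirst(sdp);  // rewrites the m-line to 'm=audio 4000 RTP/AVP 111 0' and reorders the rtpmap entries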


@@ -8,8 +8,9 @@ const {
SonioxTranscriptionEvents,
NvidiaTranscriptionEvents,
CobaltTranscriptionEvents,
JambonzTranscriptionEvents
} = require('./constants');
JambonzTranscriptionEvents,
AssemblyAiTranscriptionEvents
} = require('./constants.json');
const stickyVars = {
google: [
@@ -34,12 +35,14 @@ const stickyVars = {
'AZURE_SERVICE_ENDPOINT',
'AZURE_INITIAL_SPEECH_TIMEOUT_MS',
'AZURE_USE_OUTPUT_FORMAT_DETAILED',
'AZURE_SPEECH_SEGMENTATION_SILENCE_TIMEOUT_MS'
],
deepgram: [
'DEEPGRAM_SPEECH_KEYWORDS',
'DEEPGRAM_API_KEY',
'DEEPGRAM_SPEECH_TIER',
'DEEPGRAM_SPEECH_MODEL',
'DEEPGRAM_SPEECH_ENABLE_SMART_FORMAT',
'DEEPGRAM_SPEECH_ENABLE_AUTOMATIC_PUNCTUATION',
'DEEPGRAM_SPEECH_PROFANITY_FILTER',
'DEEPGRAM_SPEECH_REDACT',
@@ -50,6 +53,7 @@ const stickyVars = {
'DEEPGRAM_SPEECH_SEARCH',
'DEEPGRAM_SPEECH_REPLACE',
'DEEPGRAM_SPEECH_ENDPOINTING',
'DEEPGRAM_SPEECH_UTTERANCE_END_MS',
'DEEPGRAM_SPEECH_VAD_TURNOFF',
'DEEPGRAM_SPEECH_TAG'
],
@@ -101,9 +105,60 @@ const stickyVars = {
soniox: [
'SONIOX_PROFANITY_FILTER',
'SONIOX_MODEL'
],
assemblyai: [
'ASSEMBLYAI_API_KEY',
'ASSEMBLYAI_WORD_BOOST'
]
};
const consolidateTranscripts = (bufferedTranscripts, channel, language) => {
if (bufferedTranscripts.length === 1) return bufferedTranscripts[0];
let totalConfidence = 0;
const finalTranscript = bufferedTranscripts.reduce((acc, evt) => {
totalConfidence += evt.alternatives[0].confidence;
let newTranscript = evt.alternatives[0].transcript;
// If new transcript consists only of digits, spaces, and a trailing comma or period
if (newTranscript.match(/^[\d\s]+[,.]?$/)) {
newTranscript = newTranscript.replace(/\s/g, ''); // Remove all spaces
if (newTranscript.endsWith(',')) {
newTranscript = newTranscript.slice(0, -1); // Remove the trailing comma
} else if (newTranscript.endsWith('.')) {
newTranscript = newTranscript.slice(0, -1); // Remove the trailing period
}
}
const lastChar = acc.alternatives[0].transcript.slice(-1);
const firstChar = newTranscript.charAt(0);
if (lastChar.match(/\d/) && firstChar.match(/\d/)) {
acc.alternatives[0].transcript += newTranscript;
} else {
acc.alternatives[0].transcript += ` ${newTranscript}`;
}
return acc;
}, {
language_code: language,
channel_tag: channel,
is_final: true,
alternatives: [{
transcript: ''
}]
});
finalTranscript.alternatives[0].confidence = bufferedTranscripts.length === 1 ?
bufferedTranscripts[0].alternatives[0].confidence :
totalConfidence / bufferedTranscripts.length;
finalTranscript.alternatives[0].transcript = finalTranscript.alternatives[0].transcript.trim();
finalTranscript.vendor = {
name: 'deepgram',
evt: bufferedTranscripts
};
return finalTranscript;
};
const compileSonioxTranscripts = (finalWordChunks, channel, language) => {
const words = finalWordChunks.flat();
const transcript = words.reduce((acc, word) => {
@@ -161,7 +216,7 @@ const normalizeSoniox = (evt, channel, language) => {
};
};
const normalizeDeepgram = (evt, channel, language) => {
const normalizeDeepgram = (evt, channel, language, shortUtterance) => {
const copy = JSON.parse(JSON.stringify(evt));
const alternatives = (evt.channel?.alternatives || [])
.map((alt) => ({
@@ -169,10 +224,14 @@ const normalizeDeepgram = (evt, channel, language) => {
transcript: alt.transcript,
}));
/**
* note difference between is_final and speech_final in Deepgram:
* https://developers.deepgram.com/docs/understand-endpointing-interim-results
*/
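// shortUtterance (presumably set by callers expecting brief input) treats Deepgram's per-result
// is_final as final; otherwise we wait for speech_final, i.e. Deepgram's own endpointing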
return {
language_code: language,
channel_tag: channel,
is_final: evt.is_final,
is_final: shortUtterance ? evt.is_final : evt.speech_final,
alternatives: [alternatives[0]],
vendor: {
name: 'deepgram',
@@ -321,14 +380,32 @@ const normalizeAws = (evt, channel, language) => {
};
};
const normalizeAssemblyAi = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
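// assumed (illustrative) shape of an AssemblyAI realtime message handled here:
//   { message_type: 'FinalTranscript', text: 'hello world', confidence: 0.97, ... }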
return {
language_code: language,
channel_tag: channel,
is_final: evt.message_type === 'FinalTranscript',
alternatives: [
{
confidence: evt.confidence,
transcript: evt.text,
}
],
vendor: {
name: 'ASSEMBLYAI',
evt: copy
}
};
};
module.exports = (logger) => {
const normalizeTranscription = (evt, vendor, channel, language) => {
const normalizeTranscription = (evt, vendor, channel, language, shortUtterance) => {
//logger.debug({ evt, vendor, channel, language }, 'normalizeTranscription');
switch (vendor) {
case 'deepgram':
return normalizeDeepgram(evt, channel, language);
return normalizeDeepgram(evt, channel, language, shortUtterance);
case 'microsoft':
return normalizeMicrosoft(evt, channel, language);
case 'google':
@@ -345,6 +422,8 @@ module.exports = (logger) => {
return normalizeSoniox(evt, channel, language);
case 'cobalt':
return normalizeCobalt(evt, channel, language);
case 'assemblyai':
return normalizeAssemblyAi(evt, channel, language, shortUtterance);
default:
if (vendor.startsWith('custom:')) {
return normalizeCustom(evt, channel, language, vendor);
@@ -417,11 +496,12 @@ module.exports = (logger) => {
};
}
else if ('microsoft' === vendor) {
const {azureOptions = {}} = rOpts;
opts = {
...opts,
...(rOpts.hints && rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'string' &&
{AZURE_SPEECH_HINTS: rOpts.hints.map((h) => h.trim()).join(',')}),
...(rOpts.hints && rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{AZURE_SPEECH_HINTS: rOpts.hints.map((h) => h.phrase).join(',')}),
...(rOpts.altLanguages && rOpts.altLanguages.length > 0 &&
{AZURE_SPEECH_ALTERNATIVE_LANGUAGE_CODES: [...new Set(rOpts.altLanguages)].join(',')}),
@@ -435,6 +515,8 @@ module.exports = (logger) => {
...(rOpts.requestSnr && {AZURE_REQUEST_SNR: 1}),
...(rOpts.audioLogging && {AZURE_AUDIO_LOGGING: 1}),
...{AZURE_USE_OUTPUT_FORMAT_DETAILED: 1},
...(azureOptions.speechSegmentationSilenceTimeoutMs &&
{AZURE_SPEECH_SEGMENTATION_SILENCE_TIMEOUT_MS: azureOptions.speechSegmentationSilenceTimeoutMs}),
...(sttCredentials && {
...(sttCredentials.api_key && {AZURE_SUBSCRIPTION_KEY: sttCredentials.api_key}),
...(sttCredentials.region && {AZURE_REGION: sttCredentials.region}),
@@ -503,6 +585,8 @@ module.exports = (logger) => {
{DEEPGRAM_SPEECH_MODEL: deepgramOptions.model},
...(deepgramOptions.punctuate) &&
{DEEPGRAM_SPEECH_ENABLE_AUTOMATIC_PUNCTUATION: 1},
...(deepgramOptions.smartFormatting) &&
{DEEPGRAM_SPEECH_ENABLE_SMART_FORMAT: 1},
...(deepgramOptions.profanityFilter) &&
{DEEPGRAM_SPEECH_PROFANITY_FILTER: 1},
...(deepgramOptions.redact) &&
@@ -521,14 +605,16 @@ module.exports = (logger) => {
{DEEPGRAM_SPEECH_SEARCH: deepgramOptions.search.join(',')},
...(deepgramOptions.replace) &&
{DEEPGRAM_SPEECH_REPLACE: deepgramOptions.replace.join(',')},
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
...(rOpts.hints && rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
{DEEPGRAM_SPEECH_KEYWORDS: rOpts.hints.map((h) => h.trim()).join(',')}),
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
...(rOpts.hints && rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
{DEEPGRAM_SPEECH_KEYWORDS: rOpts.hints.map((h) => h.phrase).join(',')}),
...(deepgramOptions.keywords) &&
{DEEPGRAM_SPEECH_KEYWORDS: deepgramOptions.keywords.join(',')},
...('endpointing' in deepgramOptions) &&
{DEEPGRAM_SPEECH_ENDPOINTING: deepgramOptions.endpointing},
...(deepgramOptions.utteranceEndMs) &&
{DEEPGRAM_SPEECH_UTTERANCE_END_MS: deepgramOptions.utteranceEndMs},
...(deepgramOptions.vadTurnoff) &&
{DEEPGRAM_SPEECH_VAD_TURNOFF: deepgramOptions.vadTurnoff},
...(deepgramOptions.tag) &&
@@ -542,9 +628,9 @@ module.exports = (logger) => {
...opts,
...(sttCredentials.api_key) &&
{SONIOX_API_KEY: sttCredentials.api_key},
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'string' &&
{SONIOX_HINTS: rOpts.hints.join(',')}),
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{SONIOX_HINTS: JSON.stringify(rOpts.hints)}),
...(typeof rOpts.hintsBoost === 'number' &&
{SONIOX_HINTS_BOOST: rOpts.hintsBoost}),
@@ -602,9 +688,9 @@ module.exports = (logger) => {
...(rOpts.diarization && rOpts.diarizationMaxSpeakers > 0 &&
{NVIDIA_DIARIZATION_SPEAKER_COUNT: rOpts.diarizationMaxSpeakers}),
...(rOpts.separateRecognitionPerChannel && {NVIDIA_SEPARATE_RECOGNITION_PER_CHANNEL: 1}),
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'string' &&
{NVIDIA_HINTS: rOpts.hints.join(',')}),
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{NVIDIA_HINTS: JSON.stringify(rOpts.hints)}),
...(typeof rOpts.hintsBoost === 'number' &&
{NVIDIA_HINTS_BOOST: rOpts.hintsBoost}),
@@ -625,20 +711,29 @@ module.exports = (logger) => {
{COBALT_SPEECH_HINTS: rOpts.hints.join(',')}),
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{COBALT_SPEECH_HINTS: JSON.stringify(rOpts.hints)}),
...(rOpts.hints?.length > 0 && {COBALT_CONTEXT_TOKEN: cobaltOptions.contextToken || 'unk:default'}),
...(rOpts.hints?.length > 0 &&
{COBALT_CONTEXT_TOKEN: cobaltOptions.contextToken || 'unk:default'}),
...(cobaltOptions.metadata && {COBALT_METADATA: cobaltOptions.metadata}),
...(cobaltOptions.enableConfusionNetwork && {COBALT_ENABLE_CONFUSION_NETWORK: 1}),
...(cobaltOptions.compiledContextData && {COBALT_COMPILED_CONTEXT_DATA: cobaltOptions.compiledContextData}),
};
} else if ('assemblyai' === vendor) {
opts = {
...opts,
...(sttCredentials.api_key) &&
{ASSEMBLYAI_API_KEY: sttCredentials.api_key},
...(rOpts.hints?.length > 0 &&
{ASSEMBLYAI_WORD_BOOST: JSON.stringify(rOpts.hints)})
};
}
else if (vendor.startsWith('custom:')) {
let {options = {}} = rOpts;
const {auth_token, custom_stt_url} = sttCredentials;
options = {
...options,
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'string' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'string' &&
{hints: rOpts.hints}),
...(rOpts.hints.length > 0 && typeof rOpts.hints[0] === 'object' &&
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{hints: JSON.stringify(rOpts.hints)}),
...(typeof rOpts.hintsBoost === 'number' && {hintsBoost: rOpts.hintsBoost})
};
@@ -693,6 +788,10 @@ module.exports = (logger) => {
ep.removeCustomEventListener(JambonzTranscriptionEvents.ConnectFailure);
ep.removeCustomEventListener(JambonzTranscriptionEvents.Error);
ep.removeCustomEventListener(AssemblyAiTranscriptionEvents.Transcription);
ep.removeCustomEventListener(AssemblyAiTranscriptionEvents.Connect);
ep.removeCustomEventListener(AssemblyAiTranscriptionEvents.ConnectFailure);
};
const setSpeechCredentialsAtRuntime = (recognizer) => {
@@ -735,6 +834,7 @@ module.exports = (logger) => {
setChannelVarsForStt,
removeSpeechListeners,
setSpeechCredentialsAtRuntime,
compileSonioxTranscripts
compileSonioxTranscripts,
consolidateTranscripts
};
};
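A hedged sketch of consolidateTranscripts, which the transcribe task now uses in place of its private _compileTranscripts (the logger and sample events are made up; only the fields the reducer reads are shown):

const logger = require('pino')();  // any pino-style logger will do
const {consolidateTranscripts} = require('./lib/utils/transcription-utils')(logger);
const buffered = [
  {alternatives: [{confidence: 0.92, transcript: '1 2 3'}]},
  {alternatives: [{confidence: 0.88, transcript: '4 5 6'}]}
];
const evt = consolidateTranscripts(buffered, 1, 'en-US');
// evt.alternatives[0].transcript === '123456'  (digit-only chunks are squeezed and joined)
// evt.alternatives[0].confidence === 0.9       (average of the buffered confidences)
// evt.vendor = {name: 'deepgram', evt: buffered}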

package-lock.json (generated, 2823 lines changed): diff suppressed because it is too large

@@ -1,6 +1,6 @@
{
"name": "jambonz-feature-server",
"version": "0.8.4",
"version": "0.8.5",
"main": "app.js",
"engines": {
"node": ">= 10.16.0"
@@ -30,11 +30,11 @@
"@jambonz/db-helpers": "^0.9.1",
"@jambonz/http-health-check": "^0.0.1",
"@jambonz/mw-registrar": "^0.2.4",
"@jambonz/realtimedb-helpers": "^0.8.6",
"@jambonz/speech-utils": "^0.0.22",
"@jambonz/realtimedb-helpers": "^0.8.7",
"@jambonz/speech-utils": "^0.0.29",
"@jambonz/stats-collector": "^0.1.9",
"@jambonz/time-series": "^0.2.8",
"@jambonz/verb-specifications": "^0.0.37",
"@jambonz/verb-specifications": "^0.0.46",
"@opentelemetry/api": "^1.4.0",
"@opentelemetry/exporter-jaeger": "^1.9.0",
"@opentelemetry/exporter-trace-otlp-http": "^0.35.0",
@@ -47,8 +47,8 @@
"bent": "^7.3.12",
"debug": "^4.3.4",
"deepcopy": "^2.1.0",
"drachtio-fsmrf": "^3.0.27",
"drachtio-srf": "^4.5.29",
"drachtio-fsmrf": "^3.0.28",
"drachtio-srf": "^4.5.31",
"express": "^4.18.2",
"express-validator": "^7.0.1",
"ip": "^1.1.8",


@@ -17,5 +17,6 @@ require('./config-test');
require('./queue-test');
require('./in-dialog-test');
require('./http-proxy-test');
require('./sdp-utils-test');
require('./remove-test-db');
require('./docker_stop');

test/sdp-utils-test.js (new file, 26 lines)

@@ -0,0 +1,26 @@
const test = require('tape');
const {makeOpusFirst, isOpusFirst} = require('../lib/utils/sdp-utils');
const sdpTransform = require('sdp-transform');
test('test opus first', (t) => {
const sdp = 'v=0\r\no=- 3348584794228993675 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS caca8b77-5ae5-4e73-a4d5-de1fce930335\r\nm=audio 57088 UDP/TLS/RTP/SAVPF 111 63 9 0 8 13 110 126\r\nc=IN IP4 14.238.89.50\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=candidate:1401281302 1 udp 2122260223 10.231.36.146 57088 typ host generation 0 network-id 1 network-cost 10\r\na=candidate:2173263513 1 udp 1686052607 14.238.89.50 57088 typ srflx raddr 10.231.36.146 rport 57088 generation 0 network-id 1 network-cost 10\r\na=ice-ufrag:k5nc\r\na=ice-pwd:J0qwMs6HrIcFNZbDG5m8Kqpk\r\na=ice-options:trickle\r\na=fingerprint:sha-256 66:DE:9A:76:CE:11:2D:65:C4:08:C7:87:B4:90:7E:F1:8D:07:B9:F4:FF:E3:81:D7:E7:7D:C6:56:47:01:6E:55\r\na=setup:actpass\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=sendrecv\r\na=msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\na=rtcp-mux\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:63 red/48000/2\r\na=fmtp:63 111/111\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:111 opus/48000/2\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3207459321 cname:4nyPJ6KXvseBUIhu\r\na=ssrc:3207459321 msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\n';
const opusSdp = makeOpusFirst(sdp);
const parsedSdp = sdpTransform.parse(opusSdp);
const opusIndex = parsedSdp.media[0].rtp.findIndex((entry) => entry.codec === 'opus');
t.ok(opusIndex === 0, 'successfully moves opus to be the first offer');
t.end();
});
test('test is opus first', (t) => {
const sdp = 'v=0\r\no=- 3348584794228993675 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS caca8b77-5ae5-4e73-a4d5-de1fce930335\r\nm=audio 57088 UDP/TLS/RTP/SAVPF 111 63 9 0 8 13 110 126\r\nc=IN IP4 14.238.89.50\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=candidate:1401281302 1 udp 2122260223 10.231.36.146 57088 typ host generation 0 network-id 1 network-cost 10\r\na=candidate:2173263513 1 udp 1686052607 14.238.89.50 57088 typ srflx raddr 10.231.36.146 rport 57088 generation 0 network-id 1 network-cost 10\r\na=ice-ufrag:k5nc\r\na=ice-pwd:J0qwMs6HrIcFNZbDG5m8Kqpk\r\na=ice-options:trickle\r\na=fingerprint:sha-256 66:DE:9A:76:CE:11:2D:65:C4:08:C7:87:B4:90:7E:F1:8D:07:B9:F4:FF:E3:81:D7:E7:7D:C6:56:47:01:6E:55\r\na=setup:actpass\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=sendrecv\r\na=msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\na=rtcp-mux\r\na=rtpmap:111 opus/48000/2\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:63 red/48000/2\r\na=fmtp:63 111/111\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3207459321 cname:4nyPJ6KXvseBUIhu\r\na=ssrc:3207459321 msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\n';
t.ok(isOpusFirst(sdp), "opus is first offer");
const sdp2 = 'v=0\r\no=- 3348584794228993675 2 IN IP4 127.0.0.1\r\ns=-\r\nt=0 0\r\na=group:BUNDLE 0\r\na=extmap-allow-mixed\r\na=msid-semantic: WMS caca8b77-5ae5-4e73-a4d5-de1fce930335\r\nm=audio 57088 UDP/TLS/RTP/SAVPF 111 63 9 0 8 13 110 126\r\nc=IN IP4 14.238.89.50\r\na=rtcp:9 IN IP4 0.0.0.0\r\na=candidate:1401281302 1 udp 2122260223 10.231.36.146 57088 typ host generation 0 network-id 1 network-cost 10\r\na=candidate:2173263513 1 udp 1686052607 14.238.89.50 57088 typ srflx raddr 10.231.36.146 rport 57088 generation 0 network-id 1 network-cost 10\r\na=ice-ufrag:k5nc\r\na=ice-pwd:J0qwMs6HrIcFNZbDG5m8Kqpk\r\na=ice-options:trickle\r\na=fingerprint:sha-256 66:DE:9A:76:CE:11:2D:65:C4:08:C7:87:B4:90:7E:F1:8D:07:B9:F4:FF:E3:81:D7:E7:7D:C6:56:47:01:6E:55\r\na=setup:actpass\r\na=mid:0\r\na=extmap:1 urn:ietf:params:rtp-hdrext:ssrc-audio-level\r\na=extmap:2 http://www.webrtc.org/experiments/rtp-hdrext/abs-send-time\r\na=extmap:3 http://www.ietf.org/id/draft-holmer-rmcat-transport-wide-cc-extensions-01\r\na=extmap:4 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=sendrecv\r\na=msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\na=rtcp-mux\r\na=rtcp-fb:111 transport-cc\r\na=fmtp:111 minptime=10;useinbandfec=1\r\na=rtpmap:63 red/48000/2\r\na=fmtp:63 111/111\r\na=rtpmap:9 G722/8000\r\na=rtpmap:0 PCMU/8000\r\na=rtpmap:8 PCMA/8000\r\na=rtpmap:13 CN/8000\r\na=rtpmap:111 opus/48000/2\r\na=rtpmap:110 telephone-event/48000\r\na=rtpmap:126 telephone-event/8000\r\na=ssrc:3207459321 cname:4nyPJ6KXvseBUIhu\r\na=ssrc:3207459321 msid:caca8b77-5ae5-4e73-a4d5-de1fce930335 52ad01f1-b1df-4b8e-a208-9201e98b6f7b\r\n';
t.ok(!isOpusFirst(sdp2), "opus is not first offer")
const sdp3 = 'v=0\r\no=xhoaluu2 1314 1504 IN IP4 192.168.1.4\r\ns=Talk\r\nc=IN IP4 192.168.1.4\r\nt=0 0\r\na=ice-pwd:397d063ea23fdc05164e3ee4\r\na=ice-ufrag:16c449a3\r\na=rtcp-xr:rcvr-rtt=all:10000 stat-summary=loss,dup,jitt,TTL voip-metrics\r\na=group:BUNDLE as\r\na=record:off\r\nm=audio 56542 RTP/AVPF 0 8\r\nc=IN IP4 14.226.233.151\r\na=rtcp-mux\r\na=mid:as\r\na=extmap:1 urn:ietf:params:rtp-hdrext:sdes:mid\r\na=rtcp:63076 IN IP4 192.168.1.4\r\na=candidate:1 1 UDP 2130706303 192.168.1.4 56542 typ host\r\na=candidate:1 2 UDP 2130706302 192.168.1.4 63076 typ host\r\na=candidate:2 1 UDP 2130706431 2001:ee0:d744:dcf0:c1d3:d73f:7a93:dc9f 56542 typ host\r\na=candidate:2 2 UDP 2130706430 2001:ee0:d744:dcf0:c1d3:d73f:7a93:dc9f 63076 typ host\r\na=candidate:3 1 UDP 2130706431 2001:ee0:d744:dcf0:15:6be3:8e6b:b736 56542 typ host\r\na=candidate:3 2 UDP 2130706430 2001:ee0:d744:dcf0:15:6be3:8e6b:b736 63076 typ host\r\na=candidate:4 1 UDP 1694498687 14.226.233.151 56542 typ srflx raddr 192.168.1.4 rport 56542\r\na=rtcp-fb:* trr-int 1000\r\na=rtcp-fb:* ccm tmmbr';
t.ok(!isOpusFirst(sdp2), "opus is not first offer")
t.end();
});