Compare commits

...

305 Commits

Author SHA1 Message Date
Ed Robbins
1a1f53aede Compare sdp to determine if transcoding is being used. (#1444)
* compare sdp for transcoding

* refactor sdp check for leading codec

* fix reference to epOther

* minor changes

* minor

* fix #1447

* fix security issue

* use convenience getter appIsUsingWebsockets in CallSession

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2025-11-24 10:50:41 -06:00
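The commit above detects transcoding by comparing the leading (first-preference) codec in the two SDP bodies. A minimal illustrative sketch of that idea, not the actual jambonz implementation (helper names and parsing shortcuts are assumptions):

// Hypothetical helper: return the codec name of the first payload type on the audio m-line.
const getLeadingCodec = (sdp) => {
  const lines = sdp.split(/\r?\n/);
  const mline = lines.find((l) => l.startsWith('m=audio'));
  if (!mline) return null;
  const pt = mline.trim().split(/\s+/)[3];                          // first payload type, e.g. '0'
  const rtpmap = lines.find((l) => l.startsWith(`a=rtpmap:${pt} `));
  // static payload types (0 = PCMU, 8 = PCMA) may omit the rtpmap line, so fall back to the number
  return rtpmap ? rtpmap.split(/\s+/)[1].split('/')[0].toUpperCase() : pt;
};

// Transcoding is likely in play when the two legs lead with different codecs.
const isTranscoding = (sdpA, sdpB) => getLeadingCodec(sdpA) !== getLeadingCodec(sdpB);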
Hoan Luu Huu
1984b6d3ea allow say verb failed as NonFatalTaskError for File Not Found (#1443)
* allow say verb failed as NonFatalTaskError for File Not Found

* wip
2025-11-20 07:22:28 -05:00
Hoan Luu Huu
769b66f57e fixed playbackIds is not in correct order compare with say.text array (#1439)
* fixed playbackIds is not in correct order compare with say.text array

* wip

* wip
2025-11-19 19:00:44 -05:00
Hoan Luu Huu
98b845f489 fix say verb does not close streaming when finish say (#1412)
* fix say verb does not close streaming when finish say

* wip

* wip

* ttsStreamingBuffer reset eventHandlerCount after remove listeners

* only send tokens to module if connected

* wip

* sent stream_open when successfully connected to vendor
2025-11-17 08:56:09 -05:00
Ed Robbins
f92b1dbc97 Add ability to override certain tts streaming options via the config … (#1429)
* Add ability to override certain tts streaming options via the config verb.

* Update to null operator(??), support parameter override via config
2025-11-12 13:54:01 -05:00
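The second bullet refers to the nullish coalescing operator (??), which lets values supplied on the config verb override application defaults without discarding legitimate falsy values. A small sketch with made-up option names:

const appDefaults = {vendor: 'default', connectTimeoutMs: 20000, bufferLowWaterMark: 10};
const configVerbOverrides = {bufferLowWaterMark: 0};     // other options intentionally omitted

const effective = {
  // ?? falls back only on null/undefined, so an explicit 0 from the config verb survives
  vendor: configVerbOverrides.vendor ?? appDefaults.vendor,                                     // 'default'
  connectTimeoutMs: configVerbOverrides.connectTimeoutMs ?? appDefaults.connectTimeoutMs,       // 20000
  bufferLowWaterMark: configVerbOverrides.bufferLowWaterMark ?? appDefaults.bufferLowWaterMark  // 0
};
console.log(effective);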
Dave Horton
0442144793 fix bug escaping backspace character 2025-11-03 15:33:59 -05:00
Hoan Luu Huu
2de24af169 fixed gather does not start timeout on bargein (#1421)
* fixed gather does not start timeout on bargein

* with previous change, no need to emit playDone since no where in the code are we listening for it

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2025-11-03 13:11:59 -05:00
Dave Horton
a884880321 fix for #1422 (#1423)
* fix for #1422

* fix prev commit
2025-11-03 12:53:43 -05:00
Dave Horton
b307df79d0 update deps (#1417) 2025-10-31 07:31:32 -04:00
Hoan Luu Huu
77bd11dd47 update speech util 0.2.26 (#1416) 2025-10-31 07:14:38 -04:00
Hoan Luu Huu
46d56fe546 fd_1574: should not send only whitespace to streaming tts engine (#1415) 2025-10-30 20:59:25 -04:00
Hoan Luu Huu
30ab281ea2 support disableTtsCache from config verb (#1410) 2025-10-28 08:19:03 -04:00
Sam Machin
0869a73052 add distributeDtmf to conference (#1401)
* add distributeDtmf to conference

* lint

* bump verb specs
2025-10-21 11:20:12 -04:00
Sam Machin
a0a579ccee escape json special chars in metadata (#1399) 2025-10-20 10:30:03 -04:00
Sam Machin
4218653852 add customerData on transferred calls (#1391)
* add customerData on transferred calls

* change to if statement
2025-10-20 09:20:12 -04:00
Hoan Luu Huu
89cc39f726 support gladia stt (#1397)
* support gladia stt

* wip

* wip

* update verb specification
2025-10-20 04:56:39 -04:00
Sam Machin
b231593bff bump dbhelpers for cache change (#1396) 2025-10-15 11:38:43 -04:00
Sam Machin
4309d25376 don't encode querystring if its the filename (#1395)
* don't encode querystring if its the filename

* lint

* update link to issue

u
2025-10-14 10:48:50 -04:00
Hoan Luu Huu
a00703a067 support houndify stt (#1364)
* support houndify stt

* wip

* wip

* wip

* update houndify stt parameters

* wip

* wip
2025-10-14 00:55:21 -04:00
Hoan Luu Huu
89c985b564 fixed does not send final status call back if call canceled quickly (#1393)
* fixed callsession should cleanup resource if call was canceled while fetching app

* wip

* wip

* wip

* wip

* wip
2025-10-11 03:44:42 -04:00
Dave Horton
b4ed4c8c46 #1385: Gather - dont start the continuous asr timer when we first start listening if this is a background gather (#1386) 2025-10-09 08:47:51 -04:00
Hoan Luu Huu
581d309f36 support elevenlabs different endpoint (#1387)
* support elevenlabs different endpoint

* wip

* wip
2025-10-09 08:19:40 -04:00
Sam Machin
d1baf2fe37 if call is transferred from another FS then always answer (#1383)
Currently, if the call being transferred was originally an outbound call, the direction that is retrieved from redis is outbound and the INVITE of the REFER from the other FS is never answered.
However, a transferredCall will always need to be answered regardless of CallDirection, as sketched below.
2025-10-07 07:19:11 -04:00
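A tiny sketch of that rule (property names are assumptions, not the actual jambonz code): a call arriving via REFER from another feature server is answered regardless of the direction recorded in redis.

function shouldAnswerInvite({isTransferredCall, direction}) {
  if (isTransferredCall) return true;   // transferred calls must always be answered
  return direction === 'inbound';       // otherwise only inbound INVITEs are answered
}

console.log(shouldAnswerInvite({isTransferredCall: true, direction: 'outbound'})); // true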
Dave Horton
28bf0d3477 send eager_eot events (#1382) 2025-10-06 16:50:20 -04:00
Dave Horton
d2d3b4583e Fix/flux cleanup (#1379)
* for deepgram flux include the turn taking events in the transcription payload

* for deepgram flux, including turn_taking_event in the speech payload

* fix prev commit which used wrong field
2025-10-04 20:06:38 -04:00
Hoan Luu Huu
854c26db11 support deepgramflux (#1373)
* support deepgramflux

* wip

* wip

* wip

* wip

* update verb specification
2025-10-03 10:38:39 -04:00
Dave Horton
e77666a1a7 update speech-utils and drachtio-srf (#1377) 2025-10-03 10:06:37 -04:00
Ed Robbins
5acb19225b If an error occurs during initial TTS request, propagate the error (#1369)
* If an error occurs during initial TTS request, propagate the error

* fix missing semicolon

* fix jslint error

* add null check to r.playbackMilliseconds
2025-10-01 00:02:51 -04:00
Dave Horton
1d6f84c2d7 add event handler for when deepgram closes with an error (#1372) 2025-09-28 14:18:56 -04:00
Vinod Dharashive
de9b970a93 Update example-voicemail-greetings.json (#1367)
Add JP greeting
2025-09-23 09:30:40 -04:00
rammohan-y
ec786ef1dd Fix for sending synthesized-audio verb:status event when using TTS streaming (#1366)
https://github.com/jambonz/jambonz-feature-server/issues/1365
2025-09-23 09:30:05 -04:00
Sam Machin
a95a6d1683 clear main timeout when interdigit timeout is started, (#1351)
* clear main timeout when interdigit timeout is started,

also clear the asrTimeout when dtmf has taken priority

* lint

* lint
2025-09-12 09:01:51 -04:00
Dave Horton
65b3066866 catch exceptions from req.cancel() (#1359)
* catch exceptions from req.cancel()

* catch other instances of req.cancel

* fix prev commit
2025-09-11 12:25:36 -04:00
Hoan Luu Huu
057f52e56c speech_util v0.2.23 (#1358) 2025-09-11 01:30:31 -04:00
Hoan Luu Huu
b46be57eba singleDialer should create ConfirmCallSession with correct tmpFiles (#1357) 2025-09-10 22:37:41 -04:00
Hoan Luu Huu
f950d19d1c fix ConfirmCallSession in placeCall does not have access to tmpFiles for removing tmp file later (#1356) 2025-09-10 19:39:43 -04:00
pk32495
859132bb1c Fixed token missing log line. (#1354) 2025-09-10 15:03:56 -04:00
Dave Horton
acaadceaa2 fix exception when receiving REFER but dial task ended (#1353) 2025-09-10 12:23:32 -04:00
Hoan Luu Huu
add8d63e8e tts stream should not print speech credential (#1352) 2025-09-09 19:18:49 -04:00
Sam Machin
a05b72a420 Fix/1345 (#1349)
* don't try and guess carrier if LCR is set

* lint

* update dbHelpers dep
2025-09-06 14:15:16 -04:00
rammohan-y
28ff85225f Fixed issue for punctuation (#1344)
https://github.com/jambonz/jambonz-feature-server/issues/1343
2025-09-03 13:33:38 -04:00
Dave Horton
f2fe7c4d24 Fix/playback race by fs generates playback (#1331)
* update to speech-utils that generates playback id

* modify tts and say task to track current playback id and match against start and stop events

* bump speech utils

* wip

* wip

* fix race condition where say with playbackId gets stop event from previous play from cache file

* logging

* wip

* fix comparison when playing cached files

* logging
2025-08-26 09:39:25 -04:00
Dave Horton
97408c7d3b fix uncaught exception with llm streaming 2025-08-22 13:24:59 -04:00
Dave Horton
db5f0a0dce Feat/startup logging (#1333)
* turn down some logging

* add startup logging

* wip

* lint
2025-08-21 14:09:20 -04:00
Sam Machin
654ccd9d9d start timeout on bargein digits (#1312) 2025-08-19 08:34:33 -04:00
Dave Horton
ea27b20ac5 update speech-utils (#1324) 2025-08-17 10:09:55 -04:00
Hoan Luu Huu
96aa705378 update speech util version (#1323) 2025-08-14 08:39:12 -04:00
rammohan-y
5e51849839 Sending synthesized-audio notification for servedFromCache as false (#1320)
* Sending synthesized-audio notification for servedFromCache as well
https://github.com/jambonz/jambonz-feature-server/issues/1319

* Sending back the id that was set, to track the synthesized-audio
e.g. if we send a say verb with id 100, its synthesized-audio event will return 100 in the data to correlate the say verb and the synthesized-audio event (sketched below)
2025-08-13 20:56:56 -04:00
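A sketch of what such a correlated event payload could look like (field names are assumptions based on the bullets above):

const sayVerb = {verb: 'say', id: 100, text: 'hello world'};

const synthesizedAudioEvent = {
  event: 'synthesized-audio',
  data: {
    id: sayVerb.id,              // the same id (100) the application set on the say verb
    servedFromCache: false
  }
};
console.log(synthesizedAudioEvent.data.id === sayVerb.id); // true, so the app can match event to verb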
Hoan Luu Huu
44f69fa76d Support resemble tts (#1322)
* support resemble tts

* update speech utils version
2025-08-13 08:15:29 -04:00
Dave Horton
73c77bea71 add support for deepgram entity_prompt 2025-08-11 20:58:14 -04:00
rammohan-y
babc0d0dbb Fix for issue https://github.com/jambonz/jambonz-feature-server/issues/1317. (#1318)
Unable to use mod_aws_transcribe module due to security error as sessionId is not populated
2025-08-11 09:17:32 -04:00
Hoan Luu Huu
6b8d0fe1a8 update db-helper 0.9.16 (#1316) 2025-08-08 10:43:59 -04:00
Dave Horton
66bb466297 fix bug where task.kill is not passed cs (#1315) 2025-08-07 16:01:45 -04:00
Dave Horton
1933f4ec0b Feat/freeswitch logging (#1309)
* include callSid on INVITEs to freeeswitch

* remove unnecessary warning
2025-08-04 09:19:47 -04:00
Sam Machin
b1089a1ae9 pass recogniser opts in amd to stt (#1308) 2025-08-01 22:26:51 -04:00
sathish kumar pasham
93e06d887e Fix security vulnerabilities by upgrading @jambonz/realtimedb-helpers (#1305) 2025-07-30 13:24:19 -04:00
Sam Machin
b478e0ecd2 fix assert, and force methods to upper case (#1304)
* fix assert, and force methods to upper case

* add alert for updateCall errors

* lint

* handle missing method
2025-07-30 08:32:15 -04:00
Sam Machin
94d43d4b70 use tmpFiles list of parent call-session (#1301)
fixes #1299
2025-07-29 22:08:00 -04:00
Hoan Luu Huu
eb449e9169 support deepgram river (#1273)
* support deepgram river

* wip

* rebase

* fix review comment

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2025-07-29 13:49:43 -04:00
Hoan Luu Huu
158d9d7d25 support stt latency metrics (#1252)
* support stt latency metrics

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* enable stt latency calculator by config verb

* wip

* wip

* wip

* fix jslint

* fixed gather timeout does not have latency calculation

* update verb specification to use notifySttLatency

* move stt latency metric from call session to stt-latency calculator

* wip
2025-07-29 09:56:37 -04:00
Hoan Luu Huu
5886d1d945 allow pause/resume background listen with silence/blank (#1300)
* allow pause/resume background listen with silence/blank

* wip

* wip

* wip

* wip

* update drachtio-fsmrf version
2025-07-28 07:58:30 -04:00
Hoan Luu Huu
352106ec0c support referTo with tel: prefix (#1297)
* support referTo with tel: prefix

* fix review comment
2025-07-23 10:52:10 -04:00
Sam Machin
05a6bf51a7 corrected typos (#1295)
🧙‍♀️
2025-07-21 07:22:51 -04:00
Sam Machin
bd1c763e72 urlencode the . in a url querystring on play (#1293)
* urlencode the . in a url querystring on play

* lint

* fix for non querystring urls too

* lint
2025-07-20 14:03:21 -04:00
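A hedged sketch of what escaping dots in a play url's query string might look like (the real helper and its exact escaping rules are not shown in this log):

const escapeQueryDots = (url) => {
  const idx = url.indexOf('?');
  if (idx === -1) return url;
  return url.slice(0, idx + 1) + url.slice(idx + 1).replace(/\./g, '%2E');
};

console.log(escapeQueryDots('https://example.com/audio.mp3?token=a.b.c'));
// https://example.com/audio.mp3?token=a%2Eb%2Ec  (the path and its file extension are left alone)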
Sam Machin
d831a4ca7f don't fetch if whisper is an object with a single verb in it (#1290)
* don't fetch if whisper is an object with a single verb in it

* disable URL fetching of verbs on whisper

* fixes for lint

* lint
2025-07-20 13:47:31 -04:00
Sam Machin
0cc6ea987f Fix/1277 (#1278)
* wip

* forwardPAI control in verb

* update deps

* package-lock

* Update package-lock.json
2025-07-20 13:26:33 -04:00
rammohan-y
6e7521c91c Fix for issue where we should not delay from gather when JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS is set (#1283)
* Fix for issue https://github.com/jambonz/jambonz-feature-server/issues/1281

* Commented the reason for the change
2025-07-17 10:36:22 -04:00
Hoan Luu Huu
e0e2ade289 fixed gather cannot timeout if listenDuringPrompt is true (#1276)
* fixed gather cannot timeout if listenDuringPrompt is true

* wip

* wip

* wip
2025-07-15 22:53:22 -04:00
Dave Horton
ee895b4046 bump version 2025-07-15 11:40:14 -04:00
Dave Horton
2a42ccb0e1 fix for #1285 (#1286)
* fix for #1285

* typo
2025-07-15 11:38:48 -04:00
Dave Horton
62b6a814b7 fixes for LCC dial where a relative url is given as actionhook (#1282)
* fixes for LCC dial where a relative url is given as actionhook

* update to speech-utils 0.2.15 with configurable tmp folder location
2025-07-13 11:08:15 -04:00
Dave Horton
e415420150 Fix/cached audio race (#1279)
* fix for race condition when play-stop event from earlier command received

* wip

* say verb should not cache if disableTtsCache = true

* logging

* modify race condition logic to validate playback id in playback-stopped matches that from playback-start

* logging

---------

Co-authored-by: Quan HL <quan.luuhoang8@gmail.com>
2025-07-13 10:56:41 -04:00
Sam Machin
e6e039e0f2 add alert verb (#1270)
* add alert verb

* update dependencies

* Update package-lock.json

* remove await taskDone
2025-07-10 07:39:42 -04:00
Dave Horton
657e2d4a49 update speech-utils (#1275) 2025-07-09 16:10:08 -04:00
Hoan Luu Huu
337c1cded0 fixed transcription is not received when call is terminated (#1259)
* fixed transcription is not received when call is terminated

* wip

* fixed failing testcases

* wip

* wip

* wip

* wip

* should not do gracefulshutdown on stopAmd
2025-07-09 10:20:09 -04:00
Dave Horton
444abcb036 (fd 1056) if no payload returned from referHook in dial then dont do anything (#1271) 2025-07-07 12:50:46 -04:00
Hoan Luu Huu
c82a835e70 FD_1079: tts modules response_code = 0 should make say fail to exec. (#1269)
* FD_1079: tts modules response_code = 0 should make say fail to exec.

* fixed tts azure does not have response code

* fixed tts error does not raise alarm

* wip

* fixed
2025-07-07 07:59:29 -04:00
Hoan Luu Huu
3c185d4bd2 consistent assemblyAiOptions name (#1267)
* consistent assemblyAiOptions  name

* update verb specification version
2025-07-03 08:05:56 -04:00
Hoan Luu Huu
ba2049b705 support assemblyai v3 (#1265)
* support assemblyai v3

* wip

* wip

* wip

* wip

* wip

* wip
2025-07-01 15:46:19 -04:00
Dave Horton
7691af30de Fix/dial refer (#1264)
* Revert "Update dial.js (#1243)"

This reverts commit 259dedcded.

* add to .gitignore

* when we receive a REFER on the parent leg, after adulting the child the dial task in the parent session should end
2025-06-28 15:01:09 -04:00
Hoan Luu Huu
ab83b21979 support inworld tts (#1262)
* support inworld tts

* wip

* wip
2025-06-27 10:05:18 -04:00
Hoan Luu Huu
f18b62e165 support ultravox agent id (#1254)
* support ultravox agent id

* wip
2025-06-23 09:37:10 -04:00
Dave Horton
f98bf2a1f8 update to drachtio-fsmrf@4.0.4 (#1255) 2025-06-22 15:18:30 +02:00
Hoan Luu Huu
8c67c05d87 fixed bargein task loop forever (#1253) 2025-06-22 06:20:24 +02:00
Hoan Luu Huu
3f11ee58a7 fixed dub playOnTrack loop forever (#1247) 2025-06-18 21:37:32 +02:00
rammohan-y
c8d94026ff Removing video sdp when making an outbound call (#1242) 2025-06-18 21:31:00 +02:00
Hoan Luu Huu
5be6c54339 support mod_cartesia_transcribe (#1245) 2025-06-17 20:54:26 +02:00
Sam Machin
259dedcded Update dial.js (#1243) 2025-06-13 18:23:40 +02:00
Dave Horton
b70fea69cc Fix/dial unhandled rejection (#1239)
* fix bug with race condition in dial which could spike cpu due to unhandled exception

* update to 0.2.12 speech-utils with azure ssml fix

* minor
2025-06-11 11:40:51 +02:00
leedia-tech
2bea7e83e1 Update example-voicemail-greetings.json: included Italian (it-IT) voicemail response patterns based on common operator messages (#1226) 2025-06-11 11:03:51 +02:00
Rohan Sreerama
812076d4fe fix(create-call): fix url parsing (#1235) 2025-06-11 11:01:44 +02:00
Hoan Luu Huu
b0b74871e7 support say stream with text (#1227)
* support say stream with text

* wip

* wip

* wip

* wip

* update verb  specification
2025-06-10 16:56:44 +02:00
Hoan Luu Huu
29708a1f7c clear log from ws-requestor (#1238)
* clear log from ws-requestor

* wip

* wip
2025-06-10 10:34:33 +02:00
Rohan Sreerama
e686a11808 fix(play): fixes ep null issue (#1233)
* fix(play): fixes ep null issue

* chore(play): fixing formatting

* chore(play): update syntax

* chore(play): updates

* fix(play): updated fix

* fix(play): updated fix

---------

Co-authored-by: rsreerama3 <rsreerama3@gatech.edu>
2025-06-05 14:57:35 -04:00
Dave Horton
25f58d2e43 fixes #1230 (#1231) 2025-06-04 13:42:49 -04:00
Dave Horton
8e9ab83ca4 Fix/env vars rest dial (#1225)
* rest dial needs to support env vars

* wip
2025-05-30 10:14:44 -04:00
rammohan-y
e975511df5 Fix for play issue (#1223)
See https://github.com/jambonz/jambonz-feature-server/issues/1222
2025-05-29 10:32:03 -04:00
Hoan Luu Huu
4386df993c add retry for http/ws requestor (#1210)
* add retry for http requestor

* fix failing testcase

* wip

* update ws-requestor

* wip

* wip

* wip
2025-05-29 10:17:49 -04:00
Dave Horton
f20d3dba2f bump speech-utils 2025-05-28 13:13:16 -04:00
Dave Horton
b734952855 proper handling of case when amd fails because no speech credentials (#1220) 2025-05-28 13:01:08 -04:00
Hoan Luu Huu
4990b1fb68 fix mod dub throw undefined reference (#1209) 2025-05-28 09:58:33 -04:00
Hoan Luu Huu
3475f39b1d support deepgram model_id in speechCredential (#1213) 2025-05-28 08:00:56 -04:00
Hoan Luu Huu
690a7fcd55 add ws msgid to telemetry span if it's ws requestor (#1215)
* add ws msgid to telemetry span if it's ws requestor

* wip
2025-05-28 07:51:59 -04:00
Dave Horton
760394aa5e Revert "fix: member mute on conference (#1048)" (#1216)
This reverts commit 761b7f26e7.
2025-05-28 07:21:24 -04:00
Dave Horton
603dd482bc add a bit more info logging to gather (#1211) 2025-05-26 09:34:21 -04:00
Dave Horton
f670626cf7 google gemini set default model to models/gemini-2.0-flash-live-001 2025-05-22 08:12:58 -04:00
Hoan Luu Huu
b92a9c700e fixed google s2s mcp initiate wrong functionDeclarations (#1203)
* fixed google s2s mcp initiate wrong functionDeclarations

* populate model from llm verb to google setup message
2025-05-18 20:15:26 -04:00
Anton Voylenko
761b7f26e7 fix: member mute on conference (#1048) 2025-05-18 14:11:56 -04:00
Dave Horton
76df58bfc2 fix logging in start task msg (#1202)
* fix logging in start task msg

* generate uuids using native crypto lib
2025-05-16 16:54:25 -04:00
Dave Horton
c1cb57c3f6 update version 2025-05-14 15:38:15 -04:00
Dave Horton
610c9af274 update db-helpers 2025-05-13 10:32:30 -04:00
Dave Horton
c0a35af591 update to 0.2.10 speech-utils (#1199) 2025-05-13 10:11:26 -04:00
Hoan Luu Huu
9585018147 support whisper instructions (#1198)
* support whisper instructions

* wip

* update speech utils and verb specification
2025-05-13 09:44:00 -04:00
Hoan Luu Huu
d7884a837a update deepgram voice agent (#1191)
* update deepgram voice agent

* fix lint

* wip

* wip
2025-05-13 07:43:48 -04:00
Dave Horton
ca0bf36815 dont apply snake casing to either env vars or tool call args (#1194) (#1197) 2025-05-12 12:56:58 -04:00
Sam Machin
6b68d32e2c end if last_word_end is -1 (#1196)
* end if last_word_end is -1

* lint
2025-05-12 12:11:32 -04:00
rammohan-y
8217a76697 Removed this.name from Task constructor, as LLM's names are populated post calling the base construction (#1192)
Also fixed a jslint error
2025-05-12 09:14:33 -04:00
rammohan-y
5c8237b382 Feat 1179 race issue with play verb (#1183)
* Fixed race issue between queueCommand false and queueCommand true when play task is involved

https://github.com/jambonz/jambonz-feature-server/issues/1179

* removed unnecessary emitter

* added destroy mechanism for stickyEventEmitter

* clearing stickyEventEmitter

* memory leak fix
2025-05-11 20:25:48 -04:00
Vasudev Anubrolu
4ff5c845de feat/864 update speech utils for playht on prem (#1187)
* feat/864 update speech utils for playht on prem

* feat/864 update speech utils version package lock
2025-05-09 12:34:14 -04:00
Anton Voylenko
78ebd08490 feat: prioritize JAMBONES_LOGLEVEL over db setting (#1188) 2025-05-09 09:41:23 -04:00
Hoan Luu Huu
8b18532f31 fixed tts streaming buffer cannot reset timeout when lastUpdateTime is short (#1184)
* fixed tts streaming buffer cannot reset timeout when lastUpdateTime is short

* wip
2025-05-07 10:26:11 -04:00
rammohan-y
e4bb00b382 Send stop-playback event (#1186)
* Send stop-playback event

https://github.com/jambonz/jambonz-feature-server/issues/1185

* check if not notified in playback-stop, ensure that the stop-playback is sent when kill-playback is sent
2025-05-07 08:59:59 -04:00
Hoan Luu Huu
14295dcebc support google s2s (#1169)
* support google s2s

* wip
2025-05-07 07:20:33 -04:00
Hoan Luu Huu
4d68c179ea sip_decline release callSession if ws requestor is used (#1182) 2025-05-06 10:01:36 -04:00
Hoan Luu Huu
6205959f53 fix microsoft stt max client buffer size error for transcribe verb (#1173) 2025-04-29 09:41:24 -04:00
Hoan Luu Huu
ed92cb2632 update speech utils 0.2.7 (#1177)
* update speech utils 0.2.7

* wip
2025-04-29 08:26:09 -04:00
Sam Machin
3098e04ed6 send env_vars in callHook (#1175)
* send env_vars in callHook

* lint

* add try/catch
2025-04-28 09:51:37 -04:00
Hoan Luu Huu
7e2fe72b6c fix say verb cannot failover if tts_response-code != 2xx (#1174) 2025-04-28 08:46:08 -04:00
Hoan Luu Huu
c2666b7a09 fixed deepgram gather cannot be timeout on empty transcription with continueAsr (#1171) 2025-04-28 08:36:31 -04:00
Hoan Luu Huu
9d54ca8116 Jambonz support Model context protocol (MCP) (#1150)
* Jambonz support Model context protocol (MCP)

* merged mcp tools with existing llmOptions.tools

* support list of mcp servers

* wip

* wip

* wip

* fix voice agent

* fix open-ai

* fix review comment

* fix deepgram voice agent

* update verb specification version
2025-04-24 06:50:53 -04:00
Sam Machin
472f4f4532 clientTools over webhooks (#1167)
* clientTools over webhooks

* lint

* simpler toolHook response
2025-04-23 09:15:16 -04:00
Hoan Luu Huu
63899d0091 update speech utils version 0.2.6 (#1172) 2025-04-23 08:22:47 -04:00
rammohan-y
31e6997746 Updated drachtio-srf version (#1170)
https://github.com/drachtio/drachtio-server/issues/424
2025-04-22 11:06:59 -04:00
Hoan Luu Huu
15b583ef2c only connect to drachtio server if connected to freeswitch (#1123)
* only connect to drachtio server if connected to freeswitch

* wip

* wip
2025-04-22 09:55:39 -04:00
Sam Machin
0bf2013934 add default model (#1147) 2025-04-22 09:49:24 -04:00
rammohan-y
182c310191 remove video from sdp in case of reInvite if the call is audio call (#1159)
https://github.com/jambonz/jambonz-feature-server/issues/1158
2025-04-18 09:33:16 -04:00
Sam Machin
4e74bab728 handle errors in creating call (#1164)
* move createCall into the try/catch and add a completionReason to results for errors

* add default completionReason

fixes #1165

* lint
2025-04-17 07:43:22 -04:00
rammohan-y
87195b6444 Stop tts streaming, when bargeIn is enabled for gather verb and input is detected (#1154)
https://github.com/jambonz/jambonz-feature-server/issues/1153
2025-04-14 09:18:50 -04:00
rammohan-y
eb5e6fa515 Updated db-helper to 0.9.11 (#1152)
https://github.com/jambonz/jambonz-feature-server/issues/1151
2025-04-14 08:22:28 -04:00
Dave Horton
305facb03b Fix/11labs no client config (#1149)
* update to verb specs

* add parameter to api call when there is not client config provided
2025-04-12 10:36:35 -04:00
Dave Horton
d310ba0ed1 reduce verbosity of logging (#1145) 2025-04-09 15:36:58 -04:00
Hoan Luu Huu
77f0fc85a3 ell tts support speech and pronunciation_dictionary_locators (#1137) 2025-04-09 12:32:06 -04:00
Sam Machin
c708b7d007 fix initial message format (#1144) 2025-04-09 10:43:07 -04:00
Hoan Luu Huu
343b382373 fixed ws-requestor missing hook for dial:confirm (#1143) 2025-04-09 07:29:08 -04:00
rammohan-y
0a541e089d Fix for https://github.com/jambonz/jambonz-feature-server/issues/1138 (#1139) 2025-04-04 09:02:18 -04:00
rammohan-y
d910981b1a Allow hangup verb on siprec call (#1136)
* Allow hangup verb on siprec call
https://github.com/jambonz/jambonz-feature-server/issues/1135

* added sip:decline to AllowedSipRecVerbs
2025-04-04 08:23:39 -04:00
Hoan Luu Huu
3f2744f032 fixed replaceEndpoint offer single codec that callee does not support (#1131) 2025-04-03 07:58:39 -04:00
Dave Horton
fcaf2e59e7 initial changes for openai stt (#1127)
* initial changes for openai stt

* wip

* wip

* wip

* wip

* wip

* make minBargeinWordCount work for openai

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wipp

* wip

* wip

* wip

* openai stt: support for prompt templates

* lint

* wip

* support openai semantic_vad

* wip

* transcribe supports openai stt

* sip

* wip

* wip

* refactor list of stt vendors that dont need to be restarted after a final transcript

* cleanup

* wip

* cleanup

* wip

* wip

* wip

* remove credentials from log

* comment
2025-03-28 13:14:58 -04:00
rammohan-y
ee846b283d Feat 1120 video call - remove video media from SDP if the call is audio call (#1124)
* sending jambonz:error when the incoming message is not parsable

https://github.com/jambonz/jambonz-feature-server/issues/1094

* writing an alert when incoming payload is invalid

* added content to the jambonz:error payload

* removing video media from sdp if the call is an audio call. This is to avoid sending video media to destination if the incoming call is an audio call

* calling removeVideoSdp only when the environment variable JAMBONES_VIDEO_CALLS_ENABLED_IN_FS is set to true; this ensures there are no regression issues for audio calls (a sketch follows below)

* fixed jslint errors
2025-03-28 12:56:08 -04:00
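An illustrative version of what removeVideoSdp might do (the function and env-var names come from the commit message; the body and the audio-only check are assumptions):

// Drop every m=video media section from an SDP, keeping session-level lines and audio sections.
const removeVideoSdp = (sdp) => sdp
  .split(/(?=^m=)/m)                      // split into the session part plus one chunk per m= section
  .filter((section) => !section.startsWith('m=video'))
  .join('');

// Guarded behind the env flag so existing deployments see no behavior change.
const maybeStripVideo = (sdp, callIsAudioOnly) =>
  (process.env.JAMBONES_VIDEO_CALLS_ENABLED_IN_FS === 'true' && callIsAudioOnly)
    ? removeVideoSdp(sdp)
    : sdp;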
Hoan Luu Huu
acdb8695a0 allow cartesia model_id is override from synthesizer option (#1130) 2025-03-27 13:37:57 -04:00
Hoan Luu Huu
f33f197e8d gather say support ttstream (#1128) 2025-03-27 07:19:19 -04:00
Sam Machin
9c437ab687 use deepgramOptions.model (#1126)
* use deepgramOptions.model

* lint

* Update transcription-utils.js
2025-03-24 12:25:29 -04:00
Dave Horton
1873694784 update to dractio-fsmrf@4.0.2 2025-03-17 08:50:10 -04:00
rammohan-y
d36e6b4c22 set the detected language as language_code when deepgram detects the language (#1116)
https://github.com/jambonz/jambonz-feature-server/issues/1115
2025-03-11 12:16:29 -04:00
rammohan-y
0470168757 updated realtimedb-helper to 0.8.13 (#1113) 2025-03-10 09:49:04 -04:00
Sam Machin
3120dbc3e0 Feature: add digitCount to amd-utils (#1111)
* add digitCount to amd-utils

* linting

* bump verb-specs
2025-03-06 12:01:51 -05:00
Hoan Luu Huu
8b8283e603 ws requestor should store initial sessionData when sending session:adulting (#1110) 2025-03-06 07:42:47 -05:00
Dave Horton
29de4b8878 fix crashing error with some media timeout scenarios (#1108) 2025-03-05 09:48:40 -05:00
Sam Machin
fa5fc1af9f allow transcribe_status update on Listen/Transcribe tasks (#1107) 2025-03-04 12:41:27 -05:00
Sam Machin
a5e778d7f3 call jambonzHangup when API ends call (#1104) 2025-03-03 07:23:03 -05:00
Dave Horton
bf4ae5b618 #1101 - allow listen url to have relative url and use base url of app… (#1102)
* #1101 - allow listen url to have relative url and use base url of application if ws

* remove logging
2025-02-28 14:19:45 -05:00
Sam Machin
ad2d99c417 if redirect to new server update requestor for baseURL (#1096) 2025-02-28 08:04:37 -05:00
Hoan Luu Huu
af4e17f447 fixed dial transcribe is not able to receive final transcribe when closing the call (#1073)
* fixed dial transcribe is not able to receive the final transcription when closing the call.

* wip

* fix review comment

* support call session delay destroy ep when current task is transcribe

* wip

* wip

* fixed review comments

* fixed review comments
2025-02-27 07:25:01 -05:00
Hoan Luu Huu
cd2563ce17 support ultravox send user_input_message (#1100) 2025-02-26 19:50:09 -05:00
Sam Machin
af475cbea4 Update place-outdial.js (#1090)
* Update place-outdial.js

* update baseURL on redirect

* Revert "update baseURL on redirect"

This reverts commit 55778ba37edf029fa8687cd971b202af15478f95.
2025-02-25 15:09:21 -05:00
Anton Voylenko
69ba18acd1 Support sipindialog for conference (#1050)
* fix: add _onRequestWithinDialog catch block

* feat: support sipindialog for conference

* fix: remove any existing listener before adding new
2025-02-24 13:59:32 -05:00
rammohan-y
8bed44cce3 sending jambonz:error when the incoming message is not parsable (#1095)
* sending jambonz:error when the incoming message is not parsable

https://github.com/jambonz/jambonz-feature-server/issues/1094

* writing an alert when incoming payload is invalid

* added content to the jambonz:error payload
2025-02-24 12:44:25 -05:00
Dave Horton
8ede41714b fix typo: change AWS_SNS_TOPIC_ARM to AWS_SNS_TOPIC_ARN (#1093) 2025-02-24 10:51:07 -05:00
Dave Horton
ee54e4341a update drachtio-srf 2025-02-20 10:17:53 -05:00
Hoan Luu Huu
4bf2f42f33 support ultravox sends createCall response to app (#1091)
* support ultravox sends createCall response to app

* update type issue

Co-authored-by: Matt Hertogs <matthertogs@gmail.com>

---------

Co-authored-by: Matt Hertogs <matthertogs@gmail.com>
2025-02-20 07:07:03 -05:00
Dave Horton
e09c763d3a #1088 ignore UtteranceEnd if we have unprocessed words (#1089)
* #1088 ignore UtteranceEnd if we have unprocessed words

* wip
2025-02-18 16:30:59 -05:00
Dave Horton
e8a7366526 handle exceptions if we invoke _lccCallHook with new url and it rejects for some reason (#1087) 2025-02-18 13:03:34 -05:00
Dave Horton
122d267816 better handling of flush commands (#1081)
* better handling of flush commands

* rework buffering of tokens

* gather: when returning low confidence also provide the transcript

* better error handling in tts:tokens

* special handling of asr timeout for speechmatics

* remove some logs that were excessively wordy
2025-02-18 09:31:11 -05:00
Hoan Luu Huu
33bca8e67c tts stream should save tts.response_time metric (#1086)
* tts stream should save tts.response_time metric

* wip
2025-02-18 08:45:21 -05:00
Hoan Luu Huu
9c05fd3deb fix dialMusic keep running in infinity loop (#1085) 2025-02-18 07:08:19 -05:00
Hoan Luu Huu
7fa0041f6b support deepgram options noDelay (#1083)
* support deepgram options noDelay

* update verb specification version
2025-02-15 16:39:30 -05:00
Hoan Luu Huu
59d9c62cbe support create call with target.proxy (#1075) 2025-02-11 09:24:04 -05:00
Dave Horton
55b408eecb add support for deepgram keyterms (#1071) 2025-02-07 13:12:25 -05:00
Hoan Luu Huu
f241faa871 update speech utils version (#1070) 2025-02-07 08:00:33 -05:00
rammohan-y
65d35c893c Feat/1067 set default language if language is undefined (#1068)
* sending recognition mode channel variable

* change verb-specifications version

* feat/1067 - setting default language to previously set language for the recognizer object if the vendor is default

* added undefined check for fallbackVendor and fallbackLanguage
2025-02-06 08:06:56 -05:00
Hoan Luu Huu
dbdc1cd43d support voxist stt (#1066)
* support voxist stt

* wip
2025-02-05 08:33:35 -05:00
Hoan Luu Huu
7105453d81 support caching tts audio with model/model_id (#1062)
* support caching tts audio with model/model_id

* update speech utils version
2025-02-03 08:47:44 -05:00
Hoan Luu Huu
8487a4be68 support elevenlabs private agent (#1063) 2025-02-02 22:10:51 -05:00
Hoan Luu Huu
2ddcd53d6b support elevenlabs s2s (#1052)
* support elevenlabs s2s

* wip

* wip

* wip
2025-02-02 10:29:48 -05:00
rammohan-y
a4d07ddce0 Feat/1057 recognition mode (#1060)
* sending recognition mode channel variable

* change verb-specifications version
2025-01-28 08:06:04 -05:00
rammohan-y
16e044cabf feat/1053: added empty check on this.currentTask (#1054) 2025-01-22 07:16:44 -05:00
Hoan Luu Huu
ba282d775d support rimelabs tts streaming (#1047) 2025-01-18 08:17:33 -05:00
Dave Horton
a194ba833e Feat/1041 (#1045)
* initial changes for stream synonym to listen

* listen on B endpoint if nested listen in dial has channel === 2
2025-01-17 08:48:39 -05:00
rammohan-y
77f3d9d7ec feat/1034: sending socket close code when there is no response from the websocket app (#1035) 2025-01-16 10:13:00 -05:00
Sam Machin
4dbc7df93d new error for HTTP responses without stack trace (#1044)
* new error for HTTP responses without stack trace

* lint
2025-01-16 08:05:17 -05:00
Dave Horton
f71f0ac69a Fix/speechmatics (#1042)
* add speechmatics options

* wip

* speechmatics does not do endpointing for us so we need to flip on continuousAsr

* speechmatics: continousAsr should be at least equal to max_delay, if set
2025-01-15 19:12:15 -05:00
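A small sketch of the two speechmatics rules listed above (option names are guesses drawn from the bullets): continuous ASR is forced on because speechmatics does not do endpointing, and the ASR timeout is raised to at least max_delay when that is set.

const applySpeechmaticsDefaults = (opts) => {
  if (opts.vendor !== 'speechmatics') return opts;
  const maxDelay = opts.speechmaticsOptions?.max_delay;
  return {
    ...opts,
    continuousAsr: true,                                                        // no vendor endpointing
    asrTimeout: maxDelay ? Math.max(opts.asrTimeout ?? 0, maxDelay) : opts.asrTimeout
  };
};

console.log(applySpeechmaticsDefaults({vendor: 'speechmatics', asrTimeout: 1, speechmaticsOptions: {max_delay: 2}}));
// -> continuousAsr: true, asrTimeout: 2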
Dave Horton
edb7e21ff9 update deps 2025-01-14 10:45:38 -05:00
Dave Horton
cafd9530a2 update drachtio-srf and fsmrf to main branch releases (#1038) 2025-01-14 10:01:33 -05:00
Hoan Luu Huu
ca8cace284 support custom tts streaming (#1023)
* support custom tts streaming

* wip

* wip

* wip

* wip

* wip

* wip

* fix review comments
2025-01-14 07:24:06 -05:00
Hoan Luu Huu
499c800213 Feat/ultravox s2s (#1032)
* support ultravox_s2s

* support ultravox_s2s

* support ultravox_s2s

* wip

* wip

* wip

* wip

* fix ultravox toolcall

* wip
2025-01-14 07:11:55 -05:00
Sam Machin
97952afb1d add deepgram filler words (#1036)
* add deepgram filler words

* Update package.json

* Update package-lock.json
2025-01-13 11:07:24 -05:00
Hoan Luu Huu
f4e68d0ea1 fix openai_s2s is using wrong model (#1031)
* fix openai_s2s is using wrong model

* wip

* wip
2025-01-11 08:38:14 -05:00
Dave Horton
6bad1a22f3 fix #1025 (#1026)
* fix #1025

* redirect verb should be able to redirect to a new websocket endpoint
2025-01-09 15:45:20 -05:00
Hoan Luu Huu
fcefa1ff31 fix inband dtmf does not work in dial verb (#1018) 2025-01-08 18:29:43 -05:00
Hoan Luu Huu
67cd53c930 rest:dial support timeLimit (#1024)
* rest:dial support timeLimit

* wip

* wip

* clear maxCallDuration timer
2025-01-07 12:21:09 -05:00
Dave Horton
a59784b8ab update base image to node:20-alpine (#1022) 2025-01-04 16:38:25 -05:00
Dave Horton
a2581eaeb4 tts throttling and send user_interruption event (#1019)
* tts throttling and send user_interruption event

* tts streaming: if we get a flush with tokens pending, send the flush after the tokens

* wip
2025-01-04 16:34:01 -05:00
Dave Horton
3706aa4d98 #1020 - fix for sticky bargein (#1021) 2025-01-03 10:41:35 -05:00
Dave Horton
25f1e65f63 feed TTS in sentence chunks when streaming (#1013)
* feed TTS in sentence chunks when streaming

* tts streaming: treat a paragraph as a chunk of text, even if it does not end with a line-end character

* wip
2024-12-31 15:16:25 -05:00
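A rough sketch of the chunking rule described above (not the real TtsStreamingBuffer): buffered text is fed to the TTS engine one sentence at a time, and a blank-line paragraph break also closes a chunk even when no sentence-ending punctuation has arrived yet.

const chunkForTts = (buffered) => {
  const chunks = [];
  let rest = buffered;
  for (;;) {
    const m = rest.match(/[.!?]\s+|\n\n/);      // sentence end or blank-line paragraph break
    if (!m) break;
    const end = m.index + m[0].length;
    chunks.push(rest.slice(0, end).trim());
    rest = rest.slice(end);
  }
  return {chunks, remainder: rest};             // remainder is held until more tokens arrive
};

console.log(chunkForTts('Hello there. A paragraph with no period yet\n\nstill streaming'));
// chunks: ['Hello there.', 'A paragraph with no period yet'], remainder: 'still streaming'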
rammohan-y
c9f0481ca6 feat/1009, sending reason in X-Reason header when AHD processor giveup (#1014)
* feat/1009, sending reason in X-Reason header when AHD processor giveup is executed

* fixed jslint error

* added an alert
2024-12-31 15:09:23 -05:00
Hoan Luu Huu
564f6c9e55 support kill dial if sd ep is media timeout (#1001)
* support kill dial if sd ep is media timeout

* support kill dial if sd ep is media timeout

* support kill dial if sd ep is media timeout

* add media timeout reason header to bye message

* wip

* wip

* make configuration for freeswitch media timeout

* make configuration for freeswitch media timeout

* wip
2024-12-23 07:19:41 -05:00
Dave Horton
02f25f8343 fix cartesia channel vars for streaming (#1012) 2024-12-20 16:48:20 -05:00
Hoan Luu Huu
13ef89d605 support elevenlabs tts stream (#1011)
* support elevenlabs tts stream

* wip

* wip
2024-12-20 09:50:13 -05:00
Dave Horton
d05e470867 remove hardcoding of openai model 2024-12-19 18:42:57 -05:00
Hoan Luu Huu
17250f8386 support cartesia tts (#1008)
* support cartesia tts

* update speech util version

* update speech utils version
2024-12-19 07:35:47 -05:00
Dave Horton
ba3f46df64 Feat/tts streaming (#994)
* wip

* add TtsStreamingBuffer class to abstract handling of streaming tokens

* wip

* add throttling support

* support background ttsStream (#995)

* wip

* add TtsStreamingBuffer class to abstract handling of streaming tokens

* wip

* support background ttsStream

* wip

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>

* wip

* dont send if we have nothing to send

* initial testing with cartesia

* wip

---------

Co-authored-by: Hoan Luu Huu <110280845+xquanluu@users.noreply.github.com>
2024-12-18 14:44:37 -05:00
RJ Burnham
f37e1540ee Make voicemail hints case insensitive (#1007) 2024-12-13 13:42:29 -05:00
Dave Horton
5e04db82bf Feat/deepgram voice agent (#1006)
* wip

* wip

* wip
2024-12-13 10:05:23 -05:00
Dave Horton
0aa37a83ae Feat/handle 3pcc invite (#1005)
* wip

* wip

* linting
2024-12-12 18:39:15 -05:00
Hoan Luu Huu
c29ab0d858 support referBy display name (#1000)
* support referBy display name

* wip

* update verb specification
2024-12-11 12:46:29 -05:00
Sam Machin
71d4c90cbc catch error (#1002)
* catch error

* remove notifyTaskDone
2024-12-11 12:34:44 -05:00
Hoan Luu Huu
a929a649f9 fix ConfirmCallSession cannot be played (#993)
* fix ConfirmCallSession cannot be played

* fix review comments

* fix review comments
2024-12-10 19:36:42 -05:00
Dave Horton
3bb4f1a29f fix #998 incorrectly sending final transcript with is_final=false (#999) 2024-12-10 18:48:02 -05:00
Hoan Luu Huu
54cc76606b fix cannot replace endpoint for adulting session (#992)
* fix cannot replace endpoint for adulting session

* fix cannot replace endpoint for adulting session
2024-12-06 07:51:24 -05:00
rammohan-y
0458bb7d6c Feat/884: Capture system_alert when feature-server is online or offline (#950)
* writing alerts during startup and shutdown of feature-server

* feat/884: created constants for system component name and state

* feat/88: added 0.2.11 version of time-series

* feat/884: renamed constant, and added GracefulShutdownInProgress system alert
2024-12-05 09:23:03 -05:00
Sam Machin
dce4fe1f82 Fix/986 (#990)
* throw new NonFatalTask error on play file not found

* linting

* make SpeechCredentialError subclass of NonFatalTask error

* cleanup

* Update action-hook-delay.js

* bump fsmrf version

* linting and package-lock

* Update package-lock.json

* update error

* only throw on fs error "File not found"

* add alert

* update time-series dep

* Update package-lock.json

* linting

* Update play.js

* remove stack trace from error message

* fix error formatting
2024-12-04 05:47:49 -05:00
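A compact sketch of the error hierarchy the bullets above describe (class names are taken from the bullets; the handling logic is assumed): making the speech-credential error a subclass of the non-fatal error lets a missing file or bad credential skip the verb instead of ending the session.

class NonFatalTaskError extends Error {}
class SpeechCredentialError extends NonFatalTaskError {}

const runTask = (fn) => {
  try {
    fn();
  } catch (err) {
    if (err instanceof NonFatalTaskError) {
      console.log(`non-fatal, continue with the next verb: ${err.message}`);
      return;
    }
    throw err;                                  // anything else still tears down the session
  }
};

runTask(() => { throw new SpeechCredentialError('File not found'); });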
Hoan Luu Huu
e96c35d571 fixed iamrole from sessionToken to securityToken (#988)
* fixed iamrole from sessionToken to securityToken

* wip

* support get aws credential from instance profile
2024-11-29 21:58:42 -05:00
Hoan Luu Huu
070671a3fb support send refer custom header to referhook (#981) 2024-11-28 08:34:34 -05:00
rammohan-y
efdb56f0a0 feat/971 - fix to allow hints objects array (#987) 2024-11-28 07:25:10 -05:00
Hoan Luu Huu
e2edbb4a5b support enable dtmf tone (#976)
* support enable dtmf tone

* wip

* wip
2024-11-26 20:25:48 -05:00
Markus Frindt
3a6d63a1c6 Fix the issue for outbound calls that always the None credentials wer… (#984)
* Fix the issue for outbound calls that always the None credentials were used. session:new for rest dial did not contain recognizer.label and synthesizer.label

* update comment

---------

Co-authored-by: mfrindt <m.frindt@cognigy.com>
2024-11-26 10:26:20 -05:00
rammohan-y
c874ab8100 feat/975: fixed continuous asr not stopping when asrDtmfTerminationDi… (#977)
* feat/975: fixed continuous asr not stopping when asrDtmfTerminationDigit is configured

* feat/975: giving first preference to asrDtmfTerminationDigit if there is already ASR happened
2024-11-26 08:23:11 -05:00
Dave Horton
24a66fed64 wip (#979) 2024-11-19 09:37:00 -05:00
Hoan Luu Huu
c8c3738ae8 custom stt vendor ws connection should not be closed in asrTimeout (#973) 2024-11-18 10:17:31 -05:00
Dave Horton
c1330d4651 fix transcribe fixes for speechmatics (#978)
* fix transcribe fixes for speechmatics

* update to verb-specs with fixes for speechmatics

* add support for speechmatics translation

* add handlers for receiving translations

* call translation hook

* gather: no need to restart speechmatics after a final transcript during continuous asr

* graceful shutdown

* wip

* wip

* wip

* wip

* wip
2024-11-16 10:21:04 -05:00
Hoan Luu Huu
27f3a4b520 support SIP Privacy (#970) 2024-11-15 07:11:47 -05:00
Hoan Luu Huu
594c867192 unbridge dial ep with caller ep to avoid media release when referHook (#972) 2024-11-14 19:30:49 -05:00
Hoan Luu Huu
71c475e758 allow dub as http updateCall request (#974) 2024-11-14 07:20:33 -05:00
RJ Burnham
22ef201360 Add support to export to more than one otel platform. (#969)
* Add support to export to more than one otel platform.

This is helpful if you want to keep using the bundled jaeger
support in the web console AND send to an external OTLP-based platform
(such as Axiom.co!).

* Lint issues and cleanup.
2024-11-13 10:25:02 -05:00
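A minimal sketch of the multi-exporter idea using the OpenTelemetry JS SDK (package and class names are from that SDK; the actual wiring inside jambonz may differ, and the endpoints are placeholders):

const {NodeTracerProvider} = require('@opentelemetry/sdk-trace-node');
const {BatchSpanProcessor} = require('@opentelemetry/sdk-trace-base');
const {JaegerExporter} = require('@opentelemetry/exporter-jaeger');
const {OTLPTraceExporter} = require('@opentelemetry/exporter-trace-otlp-http');

const provider = new NodeTracerProvider();
// one span processor per destination: the bundled jaeger plus an external OTLP backend such as Axiom
provider.addSpanProcessor(new BatchSpanProcessor(new JaegerExporter({endpoint: 'http://127.0.0.1:14268/api/traces'})));
provider.addSpanProcessor(new BatchSpanProcessor(new OTLPTraceExporter({url: 'https://otlp.example.com/v1/traces'})));
provider.register();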
Hoan Luu Huu
5be3a910ad fix google custom voice can not be used without voice cloning key (#968) 2024-11-11 07:24:40 -05:00
Dave Horton
7615509e0b update test to use drachtio/drachtio-freeswitch-mrf:0.9.2-4 with aws_transcribe_ws fix (#964) 2024-11-08 09:52:26 -05:00
Dave Horton
851c071345 fix for #962 (#963) 2024-11-08 07:12:08 -05:00
rammohan-y
7911459c8c feat/940 stopped calling updateSpeechCredentialLastUsed (#944) 2024-11-05 15:19:08 -05:00
Hoan Luu Huu
be258950b0 feature server should send USER call to the sbc sip that is connected with the user (#949)
* feature server should send USER call to the sbc sip that is connected with the user

* feature server should send USER call to the sbc sip that is connected with the user

* feature server should send USER call to the sbc sip that is connected with the user

* fix review comment

* add env variable to enable the feature

* add env variable to enable the feature

* add env variable to enable the feature

* minor test update

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
2024-11-05 15:14:04 -05:00
Hoan Luu Huu
0520386a1e fixed dial verb should use calling id from From header (#958)
* fixed dial verb should use calling id from From header

* fix review comment

* wip
2024-11-05 13:48:35 -05:00
Hoan Luu Huu
a4b1b22324 update speech utils version (#957) 2024-11-04 08:04:19 -05:00
Hoan Luu Huu
e800cca961 support google voice cloning (#956)
* support google voice cloning

* wip
2024-11-04 07:10:52 -05:00
Dave Horton
1efb198f72 Dial: fix error when receiving a REFER without a Referred-By header (#954) 2024-10-30 13:01:36 -04:00
rammohan-y
4b5df855e1 feat/952: removed unnecessary condition which is not logging the target_sid (#953) 2024-10-29 07:34:49 -04:00
Hoan Luu Huu
24126ef1ec fixed feature server kill currenttask if jambonz hangup the call (#948) 2024-10-26 10:21:16 -04:00
Dave Horton
8e4995ec02 fix bug where middleware produces a cached app.tasks with an empty array (#947) 2024-10-24 20:43:27 -04:00
Dave Horton
a005253a9f update to latest speech-utils 2024-10-18 12:27:29 -04:00
rammohan-y
10efc5d608 feat/942: updated optimal google models (#943) 2024-10-18 10:03:56 -04:00
Hoan Luu Huu
1c48c40496 Support sip_parent_callid for sbc-outbound (#939)
* include X-CID for dial outbound if the call-session is outbound

* include X-CID for dial outbound if the call-session is outbound

* include X-CID for dial outbound if the call-session is outbound

* include X-CID for dial outbound if the call-session is outbound
2024-10-17 07:18:58 -04:00
Dave Horton
c79a6aaf8a Feat/llm update (#936)
* add support for llm:update during LLM session

* make sure to end openai session when Llm task is killed

* wip

* wip

* wip

* wip

* wip

* wip

* wip
2024-10-16 09:27:51 -04:00
Hoan Luu Huu
da5f51e8e0 update speech utils version (#937) 2024-10-16 08:26:06 -04:00
Hoan Luu Huu
e7fd40e297 support sbcCallId in calling/status hook (#934)
* support sbcCallId in calling/status hook

* support sbcCallId in calling/status hook

* support sbcCallId in calling/status hook

* wip

* wip

* wip
2024-10-14 18:00:09 -04:00
Dave Horton
f541ff1a15 add support for aws language model name when transcribing (#890)
* add support for aws language model name when transcribing

* wip - from prev branch

* wip

* support both aws grpc and ws api - detect based on transcription payload

* update to drachtio-fsmrf@4.0.0

* fix for grpc compatibility, requires JAMBONES_AWS_TRANSCRIBE_USE_GRPC env

* back out major update to drachtio-srf and fsmrf; that should come in a separate PR

* update drachtio-srf
2024-10-12 19:46:31 -04:00
Dave Horton
98b968d61f update test db (#933) 2024-10-12 19:34:40 -04:00
Dave Horton
f09722a5b5 Feat/llm verb (#931)
* wip

* working version for openai realtime beta

* lint

* tests: update db to latest 0.9.2 schema
2024-10-12 19:26:27 -04:00
Dave Horton
f84b3793e1 Feat/speechmatics (#932)
* wip

* initial working version of speechmatics

* linting
2024-10-12 18:42:53 -04:00
Dave Horton
84b7456c2d add support for speechmatics asr (#920)
* update to verb specs with speechmatics support

* discover local ip using os module
2024-10-11 09:24:36 -04:00
Hoan Luu Huu
c67499e38b update speech version 0.1.18 (#930) 2024-10-11 08:59:33 -04:00
Hoan Luu Huu
e372a3cdfb update speech version (#927) 2024-10-09 19:46:44 -04:00
rammohan-y
ea303caa1c feat/924: made actions as optional when there is no noResponseTimeout (#925) 2024-10-08 08:06:12 -04:00
Hoan Luu Huu
2af67d8f05 support changing log level runtime (#926)
* support changing log level runtime

* support changing log level runtime

* support changing log level runtime
2024-10-07 09:51:51 -04:00
Hoan Luu Huu
96b3b0fe07 Allow Say, Gather, Transcribe to finish if there is an error with the speech credential (#910)
* allow move to next task if say verb is failed because of speech credential

* allow move to next task if say verb is failed because of speech credential

* allow move to next task if say verb is failed because of speech credential

* wip

* wip
2024-10-01 13:40:41 -04:00
Hoan Luu Huu
b898b70520 support config referHook (#915) 2024-09-30 08:13:48 -04:00
Hoan Luu Huu
b9ef00dfc7 Fixed Gather digits does not work without nested say/play (#914)
* Fixed Gather digits does not work without nested say/play

* fix review comment

* add assert to make sure we don't register dtmf twice in gather verb
2024-09-30 07:46:21 -04:00
rammohan-y
68fa3c013d Feat/902: executing giveUpAction when noResponseGiveupTimeout is reached (#908)
* feat/893: made noResponseTimeout and noResponseGiveUpTimout independent

* support for giveUpActions implemented

* feat/902: using makeTask and exec of task to execute the giveUpActions

* feat/902: changed version of verb-specifications and speech-utils

* feat/902: fixed jslint errors

* feat/902: modified log

* feat/902: using a new event giveupWithTasks for processing giveUpActions

* feat/902: removed check for wakeupResolver and replaceApplication already taking care of wakeupResolver, also updated the verb-specifications version

* feat/902: removed sync for _onNoResponseGiveUpTimer function
2024-09-26 09:40:30 -04:00
Dave Horton
7c24208067 fix #916: race condition where call just ended when action hook play completes (#917) 2024-09-25 20:17:22 -04:00
Dave Horton
7f7c26e982 fix for https://github.com/jambonz/freeswitch-modules/issues/117 (#912) 2024-09-25 20:13:56 -04:00
Markus Frindt
402adc2098 add label to tts stt spans (#909)
Co-authored-by: Markus Frindt <m.frindt@cognigy.com>
2024-09-25 16:44:15 -04:00
rammohan-y
724d4fb713 Feat/893: Made noResponseTimeout and noResponseGiveUpTimeout independent (#896)
* feat/893: made noResponseTimeout and noResponseGiveUpTimout independent

* feat/893: not assuming 0 if noResponseTimeout is not specified
2024-09-19 10:29:51 -04:00
Hoan Luu Huu
673827cce3 fixed adulting call session does not send status callback if hangup is used (#907) 2024-09-19 10:08:15 -04:00
Anton Voylenko
c4c5ad33d8 feat: loop dial music (#769) 2024-09-17 13:51:01 -04:00
rammohan-y
7bbfc01cb0 feat/864: enable bargeIn when minBargeinWordCount is 0 (#900)
* feat/864: checking for undefined, because 0 is a valid value for minBargeinWordCount

* feat/864: checking for undefined and null

* feat/864: corrected spelling of mode and added check for undefined as 0 is a valid value for vad.mode
2024-09-17 13:26:29 -04:00
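A one-liner illustrating the reasoning in this fix (the surrounding names are assumed): 0 is a valid minBargeinWordCount, so the check must be for undefined/null rather than truthiness.

const resolveMinBargeinWordCount = (value, fallback = 1) =>
  (value !== undefined && value !== null) ? value : fallback;

console.log(resolveMinBargeinWordCount(0));         // 0  (a plain `value || fallback` would wrongly give 1)
console.log(resolveMinBargeinWordCount(undefined)); // 1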
Hoan Luu Huu
7daf056d6b allow set vendor model or engine in runtime (#897) 2024-09-12 09:03:15 +01:00
Hoan Luu Huu
e69afc4be4 fix recognizer/synthesizer label wrongly select between verb and app (#881)
* fix recognizer/synthesizer label wrongly select between verb and application

* fix jslint

* fix ASR cannot fallback

* update tts fallback does not send notification
2024-09-11 09:34:52 +01:00
rammohan-y
3a7cc27d0a feat/891: getting the options from customOptions in case of custom stt (#892) 2024-09-10 09:38:33 +01:00
Dave Horton
c4a6057fc6 bump version 2024-09-04 13:31:05 +01:00
rammohan-y
174438bb01 Feat/882: default model setting for en-IN language (#888)
* feat/882: default model setting for en-IN language

* feat/882: refactored if into ||
2024-09-03 13:22:38 +01:00
Antony Jukes
4348615b75 Create Call Rest is missing target headers on outdial (#874)
* add target headers for rest create-call

* rebased

---------

Co-authored-by: ajukes <ajukes@callable.io>
2024-09-02 21:48:09 +01:00
Hoan Luu Huu
d365883bfe fix #883 that after kicked from conference, no longer receive freeswitch CUSTOM event (#886)
* fix #883 that after kicked from conference, no longer receive freeswitch CUSTOM event

* fix #883 that after kicked from conference, no longer receive freeswitch CUSTOM event

* reset Esl Custom event after conference.

* update drachtio fsmrf version
2024-08-31 14:47:39 +01:00
Dave Horton
c0ab936b76 wip (#830)
* wip

* wip

* wip

* wip

* update deps

* update test to use latest freeswitch image
2024-08-29 15:23:49 -04:00
Dave Horton
600ff763fa fix #840 (#880) 2024-08-26 10:14:59 -04:00
Hoan Luu Huu
4d077e990f Fix/audio issue kick conference (#878)
* rest call session does not handle for RE-INVITE

* fixed audio is bad after kicked from conference

* fix review comment
2024-08-23 09:28:39 -04:00
RJ Burnham
eccef54b04 Add support for configuring the IP address that is advertised to the API server. (#875)
This is needed when running in Fargate, as ip.address() will return the wrong IP address.
2024-08-23 08:33:16 -04:00
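A hedged sketch of the idea (the environment variable name here is an assumption, not necessarily the one this PR introduced): prefer an explicitly configured address and fall back to interface discovery only when no override is set.

const ip = require('ip');

// ip.address() picks a local interface address; under Fargate that is not the reachable task address,
// so an operator-supplied override wins when present.
const advertisedAddress = process.env.JAMBONZ_ADVERTISED_IP || ip.address();
console.log(`advertising feature-server API at ${advertisedAddress}`);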
Dave Horton
2790e6d9ad fix linting error from PR 2024-08-20 08:36:24 -04:00
rammohan-y
f95d8639be Feat/868: Use global synthesizer config properties for say verb (#869)
* feat/868: Use the properties from global config in verb for TTS

* feat/868: setting this.options to combination of cs.synthesizer.options and this.options

* feat/868: Move the logic of copying cs properties to parent class tts-task.js

* feat/868: add empty line that was removed, say.js restored to original version

* feat/868: moved _synthesizeWithSpecificVendor to tts-task.js

---------

Co-authored-by: Rammohan Yadavalli <rammohan.yadavalli@kore.com>
2024-08-20 08:31:44 -04:00
Hoan Luu Huu
fc838512b6 Fixed long amd hints make freeswitch module cannot connect the vendor (#872)
* rest call session does not handle for RE-INVITE

* fixed long amd hints make freeswitch module cannot connect the vendor
2024-08-20 07:30:32 -04:00
Dave Horton
68992bccf6 fix #866 (#867) 2024-08-16 14:54:22 -04:00
Anton Voylenko
c131fceea7 fix: misleading log on call creation (#865) 2024-08-15 09:15:08 -04:00
Hoan Luu Huu
12174359f2 fix support precache audio with tts stream (#855)
* fix support precache audio with tts stream

* update speech util
2024-08-15 08:22:00 -04:00
Hoan Luu Huu
020c84d2df rest call session does not handle for RE-INVITE (#863) 2024-08-14 07:19:00 -04:00
Hoan Luu Huu
62d71d2504 fix conference in cluster have correct direction in callInfo (#842)
* fix conference in cluster have correct direction

* update github action
2024-08-13 20:02:50 -04:00
Anton Voylenko
c594797cb0 fix: support _lccMuteStatus for conference (#853) 2024-08-13 09:57:26 -04:00
Anton Voylenko
bae96a6752 fix: do not run snake case for customer data (#861) 2024-08-13 09:45:58 -04:00
rammohan-kore
ee68575ea4 Feat/844 Sending callSid in the custom-stt start message (#848)
* https://github.com/jambonz/jambonz-feature-server/issues/844

sending callSid in options, so that the callSid is sent to stt websocket in case of custom websocket

* feat/844: checking for existance of task.cs.callSid

* feat/844: changed the condition to task.cs?.callSid
2024-08-13 09:28:19 -04:00
rammohan-kore
6d0aeff6e2 feat/859: updated verb-specifications to 0.0.76 (#860) 2024-08-13 08:56:56 -04:00
rammohan-kore
d2a5d483d0 feat/856: sending DEEPGRAM_SPEECH_MODEL_VERSION to deepgram (#858) 2024-08-12 09:34:23 -04:00
Hoan Luu Huu
d3eb106d5d clear gather timeout if interim result received (#800)
* clear gather timeout if interim result received

* fix to reset timeout timer if interim result received

* fix to reset timeout timer if interim result received
2024-08-08 07:50:46 -04:00
Hoan Luu Huu
689e55bdf0 support wait hook for conf:participant-action hold (#851) 2024-08-08 07:42:11 -04:00
Hoan Luu Huu
ed7e036890 support jambonz transcribe sampling rate (#847)
* support jambonz transcribe sampling rate

* fix review comment

* update verb specification version
2024-08-07 10:39:58 -04:00
Hoan Luu Huu
f90fcdf57b Feat/deepgrap tts onprem (#846)
* support deepgram tts onprem

* update speech utils version
2024-08-07 07:25:28 -04:00
rammohan-kore
c2a1819cbb feat/813 - notify speech-bargein-detected and dtmf-bargein-detected events (#823)
* feat/813 - notify speech-bargein-detected and dtmf-bargein-detected events

* fix for #826 race condition in say (#827)

* fix for #826 race condition in say

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* wip

* Update transcription-utils.js (#802)

* Check the confidence levels of a transcript  with minConfidence (#808)

* https://github.com/jambonz/jambonz-feature-server/issues/807

* feat/807: Using minConfidence from recognizer settings

* feat/807: new reason stt-min-confidence-error

* feat/807: sending stt-min-confidence instead of  stt-min-confidence-error

* feat/807: sending stt-low-confidence instead of  stt-min-confidence-error

* feat/807 - removed ? for this.data

* fix conference end is not sent when moderator leave conference (#825)

* fix conference end is not sent when moderator leave conference

* wip

* fix review comment

* feat/813: checking for playComplete before sending dtmf-bargein-detected event

* feat/813: added this.playComplete=true at the end of _killAudio method

* feat/813: removed empty line

* feat/813: removed nested if and added condition to main if

* feat/813: notifyStatus called when not playComplete

* feat/813: referring to time-series 0.2.9 version

* feat/813: generated package-lock.json

---------

Co-authored-by: Dave Horton <daveh@beachdognet.com>
Co-authored-by: Vinod Dharashive <vdharashive@gmail.com>
Co-authored-by: Hoan Luu Huu <110280845+xquanluu@users.noreply.github.com>
2024-08-06 09:34:15 -04:00
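One of the folded-in changes above compares the transcript confidence against minConfidence and reports low-confidence results with a distinct reason while still returning the transcript. A sketch of that logic with assumed field names:

const evaluateTranscript = (evt, minConfidence) => {
  const {transcript, confidence} = evt;
  if (typeof minConfidence === 'number' && confidence < minConfidence) {
    return {reason: 'stt-low-confidence', transcript, confidence};  // low confidence, transcript still included
  }
  return {reason: 'speech-detected', transcript, confidence};
};

console.log(evaluateTranscript({transcript: 'book a flight', confidence: 0.42}, 0.6));
// -> { reason: 'stt-low-confidence', transcript: 'book a flight', confidence: 0.42 }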
rammohan-kore
4259a24fa0 feat/758: for google getting the language_code from evt (#843) 2024-08-06 07:45:08 -04:00
rammohan-kore
e4e37d5697 feat/836: capturing callSid for STT and TTS alerts (#838)
* feat/836: capturing callSid for STT and TTS alerts

* feat/836: corrected assignment of callSid and added target_sid at few more alerts

* update github action

---------

Co-authored-by: Quan HL <quan.luuhoang8@gmail.com>
2024-08-05 12:14:08 -04:00
Markus Frindt
b7a3c2970a Bug/fix missing arg reconnect alert (#835)
* Add url as argument to a webhook connection failure alert after reconnect error

* npm audit fix to remove 15 high vulnerabilities

---------

Co-authored-by: Markus Frindt <m.frindt@cognigy.com>
2024-07-31 09:25:31 -04:00
Hoan Luu Huu
cc33ac1d51 fix conference end is not sent when moderator leave conference (#825)
* fix conference end is not sent when moderator leave conference

* wip

* fix review comment
2024-07-30 07:32:07 -04:00
89 changed files with 12500 additions and 12563 deletions


@@ -12,6 +12,11 @@ jobs:
node-version: 20
- run: npm ci
- run: npm run jslint
- name: Install Docker Compose
run: |
sudo curl -L "https://github.com/docker/compose/releases/download/1.29.2/docker-compose-$(uname -s)-$(uname -m)" -o /usr/local/bin/docker-compose
sudo chmod +x /usr/local/bin/docker-compose
docker-compose --version
- run: docker pull drachtio/sipp
- run: npm test
env:

.gitignore (vendored): 6 lines changed

@@ -2,6 +2,9 @@
logs
*.log
.claude/
CLAUDE.md
# Runtime data
pids
*.pid
@@ -42,4 +45,5 @@ ecosystem.config.js
test/credentials/*.json
run-tests.sh
run-coverage.sh
.vscode
.vscode
.env


@@ -1,4 +1,4 @@
-FROM --platform=linux/amd64 node:18.15-alpine3.16 as base
+FROM --platform=linux/amd64 node:20-alpine as base
RUN apk --update --no-cache add --virtual .builds-deps build-base python3


@@ -13,7 +13,7 @@ Configuration is provided via environment variables:
|AWS_ACCESS_KEY_ID| aws access key id, used for TTS/STT as well SNS notifications|no|
|AWS_REGION| aws region| no|
|AWS_SECRET_ACCESS_KEY| aws secret access key, used per above|no|
|AWS_SNS_TOPIC_ARM| aws sns topic arn that scale-in lifecycle notifications will be published to|no|
|AWS_SNS_TOPIC_ARN| aws sns topic arn that scale-in lifecycle notifications will be published to|no|
|DRACHTIO_HOST| ip address of drachtio server (typically '127.0.0.1')|yes|
|DRACHTIO_PORT| listening port of drachtio server for control connections (typically 9022)|yes|
|DRACHTIO_SECRET| shared secret|yes|
@@ -21,6 +21,7 @@ Configuration is provided via environment variables:
|ENCRYPTION_SECRET| secret for credential encryption(JWT_SECRET is deprecated) |yes|
|GOOGLE_APPLICATION_CREDENTIALS| path to gcp service key file|yes|
|HTTP_PORT| tcp port to listen on for API requests from jambonz-api-server|yes|
|HTTP_IP| IP Address for API requests from jambonz-api-server |no|
|JAMBONES_GATHER_EARLY_HINTS_MATCH| if true and hints are provided, gather will opportunistically review interim transcripts if possible to reduce ASR latency |no|
|JAMBONES_FREESWITCH| IP:port:secret for Freeswitch server (e.g. '127.0.0.1:8021:JambonzR0ck$'|yes|
|JAMBONES_LOGLEVEL| log level for application, 'info' or 'debug'|no|
@@ -71,7 +72,7 @@ module.exports = {
STATS_PORT: 8125,
STATS_PROTOCOL: 'tcp',
STATS_TELEGRAF: 1,
AWS_SNS_TOPIC_ARM: 'arn:aws:sns:us-west-1:xxxxxxxxxxx:terraform-20201107200347128600000002',
AWS_SNS_TOPIC_ARN: 'arn:aws:sns:us-west-1:xxxxxxxxxxx:terraform-20201107200347128600000002',
JAMBONES_NETWORK_CIDR: '172.31.0.0/16',
JAMBONES_MYSQL_HOST: 'aurora-cluster-jambonz.cluster-yyyyyyyyyyy.us-west-1.rds.amazonaws.com',
JAMBONES_MYSQL_USER: 'admin',

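The rename from AWS_SNS_TOPIC_ARM to AWS_SNS_TOPIC_ARN above corrects a long-standing misspelling, but it is a breaking change for deployments that still export the old name. A minimal sketch, not part of the repo, of reading the corrected variable while tolerating the legacy spelling during a transition:

// hypothetical transition helper; a clean deployment simply sets AWS_SNS_TOPIC_ARN
const AWS_SNS_TOPIC_ARN =
  process.env.AWS_SNS_TOPIC_ARN ||
  process.env.AWS_SNS_TOPIC_ARM;   // legacy, misspelled variable name

if (!AWS_SNS_TOPIC_ARN) {
  console.warn('AWS_SNS_TOPIC_ARN is not set; scale-in lifecycle notifications are disabled');
}

module.exports = { AWS_SNS_TOPIC_ARN };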
143
app.js
View File

@@ -25,9 +25,80 @@ const opts = {
};
const pino = require('pino');
const logger = pino(opts, pino.destination({sync: false}));
const {LifeCycleEvents, FS_UUID_SET_NAME} = require('./lib/utils/constants');
const {LifeCycleEvents, FS_UUID_SET_NAME, SystemState, FEATURE_SERVER} = require('./lib/utils/constants');
const installSrfLocals = require('./lib/utils/install-srf-locals');
installSrfLocals(srf, logger);
const createHttpListener = require('./lib/utils/http-listener');
const healthCheck = require('@jambonz/http-health-check');
const ProcessMonitor = require('./lib/utils/process-monitor');
const monitor = new ProcessMonitor(logger);
// Log startup
monitor.logStartup();
monitor.setupSignalHandlers();
logger.on('level-change', (lvl, _val, prevLvl, _prevVal, instance) => {
if (logger !== instance) {
return;
}
logger.info('system log level %s was changed to %s', prevLvl, lvl);
});
// Install the srf locals
installSrfLocals(srf, logger, {
onFreeswitchConnect: (wrapper) => {
// Only connect to drachtio once freeswitch is connected
logger.info(`connected to freeswitch at ${wrapper.ms.address}, starting drachtio server`);
if (DRACHTIO_HOST) {
srf.connect({host: DRACHTIO_HOST, port: DRACHTIO_PORT, secret: DRACHTIO_SECRET });
srf.on('connect', (err, hp) => {
const arr = /^(.*)\/(.*)$/.exec(hp.split(',').pop());
srf.locals.localSipAddress = `${arr[2]}`;
logger.info(`connected to drachtio listening on ${hp}, local sip address is ${srf.locals.localSipAddress}`);
});
}
else {
logger.info(`listening for drachtio requests on port ${DRACHTIO_PORT}`);
srf.listen({port: DRACHTIO_PORT, secret: DRACHTIO_SECRET});
}
// Start Http server
createHttpListener(logger, srf)
.then(({server, app}) => {
httpServer = server;
healthCheck({app, logger, path: '/', fn: getCount});
return {server, app};
})
.catch((err) => {
logger.error(err, 'Error creating http listener');
});
},
onFreeswitchDisconnect: (wrapper) => {
// if all freeswitch connections are lost, disconnect the drachtio server
logger.info(`lost connection to freeswitch at ${wrapper.ms.address}`);
const ms = srf.locals.getFreeswitch();
if (!ms) {
logger.info('no freeswitch connections, stopping drachtio server');
disconnect();
}
}
});
if (NODE_ENV === 'test') {
srf.on('error', (err) => {
logger.info(err, 'Error connecting to drachtio');
});
}
// Init services
const writeSystemAlerts = srf.locals?.writeSystemAlerts;
if (writeSystemAlerts) {
writeSystemAlerts({
system_component: FEATURE_SERVER,
state : SystemState.Online,
fields : {
detail: `feature-server with process_id ${process.pid} started`,
host: srf.locals?.ipv4
}
});
}
const {
initLocals,
@@ -42,24 +113,6 @@ const {
const InboundCallSession = require('./lib/session/inbound-call-session');
const SipRecCallSession = require('./lib/session/siprec-call-session');
if (DRACHTIO_HOST) {
srf.connect({host: DRACHTIO_HOST, port: DRACHTIO_PORT, secret: DRACHTIO_SECRET });
srf.on('connect', (err, hp) => {
const arr = /^(.*)\/(.*)$/.exec(hp.split(',').pop());
srf.locals.localSipAddress = `${arr[2]}`;
logger.info(`connected to drachtio listening on ${hp}, local sip address is ${srf.locals.localSipAddress}`);
});
}
else {
logger.info(`listening for drachtio requests on port ${DRACHTIO_PORT}`);
srf.listen({port: DRACHTIO_PORT, secret: DRACHTIO_SECRET});
}
if (NODE_ENV === 'test') {
srf.on('error', (err) => {
logger.info(err, 'Error connecting to drachtio');
});
}
srf.use('invite', [
initLocals,
createRootSpan,
@@ -85,23 +138,28 @@ sessionTracker.on('idle', () => {
}
});
const getCount = () => sessionTracker.count;
const healthCheck = require('@jambonz/http-health-check');
let httpServer;
const createHttpListener = require('./lib/utils/http-listener');
createHttpListener(logger, srf)
.then(({server, app}) => {
httpServer = server;
healthCheck({app, logger, path: '/', fn: getCount});
return {server, app};
})
.catch((err) => {
logger.error(err, 'Error creating http listener');
});
setInterval(() => {
const monInterval = setInterval(async() => {
srf.locals.stats.gauge('fs.sip.calls.count', sessionTracker.count);
try {
const systemInformation = await srf.locals.dbHelpers.lookupSystemInformation();
if (systemInformation && systemInformation.log_level) {
const envLogLevel = logger.levels.values[JAMBONES_LOGLEVEL.toLowerCase()];
const dbLogLevel = logger.levels.values[systemInformation.log_level];
const appliedLogLevel = Math.min(envLogLevel, dbLogLevel);
if (logger.levelVal !== appliedLogLevel) {
logger.level = logger.levels.labels[Math.min(envLogLevel, dbLogLevel)];
}
}
} catch (err) {
if (process.env.NODE_ENV === 'test') {
clearInterval(monInterval);
logger.error('all tests complete');
}
else logger.error({err}, 'Error checking system log level in database');
}
}, 20000);
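For reference, the periodic check above amounts to taking the more verbose of the env-configured level and the level stored in the database, then applying it to the pino logger. A standalone sketch, assuming a lookupSystemInformation() helper that resolves to an object with a log_level property:

const pino = require('pino');
const JAMBONES_LOGLEVEL = process.env.JAMBONES_LOGLEVEL || 'info';
const logger = pino({level: JAMBONES_LOGLEVEL});

const syncLogLevel = async(lookupSystemInformation) => {
  const info = await lookupSystemInformation();   // assumed to resolve to {log_level: 'debug' | 'info' | ...}
  if (!info?.log_level) return;
  const envLevel = logger.levels.values[JAMBONES_LOGLEVEL.toLowerCase()];
  const dbLevel = logger.levels.values[info.log_level];
  const applied = Math.min(envLevel, dbLevel);    // lower pino value means more verbose, so the chattier setting wins
  if (logger.levelVal !== applied) logger.level = logger.levels.labels[applied];
};

Run on a 20-second interval, as above, this lets operators raise verbosity from the database without restarting the feature server.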
const disconnect = () => {
@@ -109,16 +167,29 @@ const disconnect = () => {
httpServer?.on('close', resolve);
httpServer?.close();
srf.disconnect();
srf.removeAllListeners();
srf.locals.mediaservers?.forEach((ms) => ms.disconnect());
});
};
process.on('SIGTERM', handle);
process.on('SIGINT', handle);
function handle(signal) {
async function handle(signal) {
const {removeFromSet} = srf.locals.dbHelpers;
srf.locals.disabled = true;
logger.info(`got signal ${signal}`);
const writeSystemAlerts = srf.locals?.writeSystemAlerts;
if (writeSystemAlerts) {
// this write must be awaited, otherwise the process can exit before the alert is saved
await writeSystemAlerts({
system_component: FEATURE_SERVER,
state : SystemState.Offline,
fields : {
detail: `feature-server with process_id ${process.pid} stopped, signal ${signal}`,
host: srf.locals?.ipv4
}
});
}
const setName = `${(JAMBONES_CLUSTER_ID || 'default')}:active-fs`;
const fsServiceUrlSetName = `${(JAMBONES_CLUSTER_ID || 'default')}:fs-service-url`;
if (setName && srf.locals.localSipAddress) {

View File

@@ -163,5 +163,72 @@
"wird sich bei Ihnen melden",
"ich melde mich bei dir",
"wir können nicht"
],
"it-IT": [
"segreteria telefonica",
"risponde la segreteria telefonica",
"lascia un messaggio",
"puoi lasciare un messaggio dopo il segnale",
"dopo il segnale acustico",
"il numero chiamato non è raggiungibile",
"non è raggiungibile",
"lascia pure un messaggio",
"puoi lasciare un messaggio"
],
"ja-JP": [
"この通話は留守番電話に転送されました",
"発信先は現在電話に出ることができません",
"発信音の後でメッセージを録音してください",
"録音を完了したら電話を切ることができます",
"只今電話に出ることができません",
"ただ今電話に出ることができません",
"ただいま電話に出ることができません",
"ピーという発信音の後にお名前とご用件をお話しください",
"ファックスを送られる方はスタートボタンを押してください",
"FAXを送られる方はスタートボタンを押してください",
"おかけになった電話をお呼びしましたが",
"お出になりません",
"おでになりません",
"お掛けになった電話番号は",
"おかけになった電話番号は",
"お掛けになった電話は",
"おかけになった電話は",
"現在使われておりません",
"番号をお確かめになって",
"お掛け直し下さい",
"おかけ直し下さい",
"おかけ直しください",
"こちらはNTTドコモです",
"こちらはエーユーです",
"こちらはソフトバンクです",
"電波の届かない",
"電源が入っていない",
"掛かりません",
"かかりません",
"お繋ぎすることが出来ません",
"お繋ぎ出来ません",
"お繋ぎすることができません",
"お繋ぎできません",
"おつなぎすることができません",
"おつなぎできません",
"メッセージを録音",
"留守番電話",
"お留守番サービス",
"留守番",
"留守電",
"留守",
"接続します",
"合図の音",
"ピーと",
"発信音",
"ご用件",
"伝言",
"お話しください",
"ファックス",
"FAX",
"終了",
"終了しました",
"終了いたしました",
"営業時間"
]
}
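The it-IT and ja-JP entries above extend the voicemail-greeting hint lists used for answering machine detection (the file is configured via VMD_HINTS_FILE). The matching logic lives elsewhere; a simplified, purely illustrative matcher would just check whether a transcript contains any hint for the call's language:

// hypothetical matcher, for illustration only; the path and function name are assumptions
const hints = require('./vmd-hints.json');

const looksLikeVoicemailGreeting = (transcript, language = 'ja-JP') => {
  const list = hints[language] || [];
  const text = transcript.toLowerCase();
  return list.some((hint) => text.includes(hint.toLowerCase()));
};

// looksLikeVoicemailGreeting('ただいま電話に出ることができません') === true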

View File

@@ -73,6 +73,7 @@ const JAMBONES_LOGLEVEL = process.env.JAMBONES_LOGLEVEL || 'info';
const JAMBONES_INJECT_CONTENT = process.env.JAMBONES_INJECT_CONTENT;
const PORT = parseInt(process.env.HTTP_PORT, 10) || 3000;
const HTTP_IP = process.env.HTTP_IP;
const HTTP_PORT_MAX = parseInt(process.env.HTTP_PORT_MAX, 10);
const K8S = process.env.K8S;
@@ -92,7 +93,7 @@ const AWS_REGION = process.env.AWS_REGION;
const AWS_ACCESS_KEY_ID = process.env.AWS_ACCESS_KEY_ID;
const AWS_SECRET_ACCESS_KEY = process.env.AWS_SECRET_ACCESS_KEY;
const AWS_SNS_PORT = parseInt(process.env.AWS_SNS_PORT, 10) || 3001;
const AWS_SNS_TOPIC_ARM = process.env.AWS_SNS_TOPIC_ARM;
const AWS_SNS_TOPIC_ARN = process.env.AWS_SNS_TOPIC_ARN;
const AWS_SNS_PORT_MAX = parseInt(process.env.AWS_SNS_PORT_MAX, 10) || 3005;
const GCP_JSON_KEY = process.env.GCP_JSON_KEY;
@@ -107,6 +108,8 @@ const DEEPGRAM_API_KEY = process.env.DEEPGRAM_API_KEY;
const ANCHOR_MEDIA_ALWAYS = process.env.ANCHOR_MEDIA_ALWAYS;
const VMD_HINTS_FILE = process.env.VMD_HINTS_FILE;
const JAMBONES_AWS_TRANSCRIBE_USE_GRPC = process.env.JAMBONES_AWS_TRANSCRIBE_USE_GRPC;
/* security, secrets */
const LEGACY_CRYPTO = !!process.env.LEGACY_CRYPTO;
const JWT_SECRET = process.env.JWT_SECRET;
@@ -127,12 +130,18 @@ const OPTIONS_PING_INTERVAL = parseInt(process.env.OPTIONS_PING_INTERVAL, 10) ||
const JAMBONZ_RECORD_WS_BASE_URL = process.env.JAMBONZ_RECORD_WS_BASE_URL || process.env.JAMBONES_RECORD_WS_BASE_URL;
const JAMBONZ_RECORD_WS_USERNAME = process.env.JAMBONZ_RECORD_WS_USERNAME || process.env.JAMBONES_RECORD_WS_USERNAME;
const JAMBONZ_RECORD_WS_PASSWORD = process.env.JAMBONZ_RECORD_WS_PASSWORD || process.env.JAMBONES_RECORD_WS_PASSWORD;
const JAMBONZ_DISABLE_DIAL_PAI_HEADER = process.env.JAMBONZ_DISABLE_DIAL_PAI_HEADER || false;
const JAMBONZ_DIAL_PAI_HEADER = process.env.JAMBONZ_DIAL_PAI_HEADER || false;
const JAMBONES_DISABLE_DIRECT_P2P_CALL = process.env.JAMBONES_DISABLE_DIRECT_P2P_CALL || false;
const JAMBONES_EAGERLY_PRE_CACHE_AUDIO = parseInt(process.env.JAMBONES_EAGERLY_PRE_CACHE_AUDIO, 10) || 0;
const JAMBONES_USE_FREESWITCH_TIMER_FD = process.env.JAMBONES_USE_FREESWITCH_TIMER_FD;
const JAMBONES_DIAL_SBC_FOR_REGISTERED_USER = process.env.JAMBONES_DIAL_SBC_FOR_REGISTERED_USER || false;
const JAMBONES_MEDIA_TIMEOUT_MS = process.env.JAMBONES_MEDIA_TIMEOUT_MS || 0;
const JAMBONES_MEDIA_HOLD_TIMEOUT_MS = process.env.JAMBONES_MEDIA_HOLD_TIMEOUT_MS || 0;
// jambonz
const JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS =
process.env.JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS;
module.exports = {
JAMBONES_MYSQL_HOST,
@@ -170,6 +179,7 @@ module.exports = {
JAMBONES_CLUSTER_ID,
PORT,
HTTP_PORT_MAX,
HTTP_IP,
K8S,
K8S_SBC_SIP_SERVICE_NAME,
JAMBONES_SUBNET,
@@ -182,12 +192,13 @@ module.exports = {
AWS_ACCESS_KEY_ID,
AWS_SECRET_ACCESS_KEY,
AWS_SNS_PORT,
AWS_SNS_TOPIC_ARM,
AWS_SNS_TOPIC_ARN,
AWS_SNS_PORT_MAX,
ANCHOR_MEDIA_ALWAYS,
VMD_HINTS_FILE,
JAMBONES_FREESWITCH_MAX_CALL_DURATION_MINS,
JAMBONES_AWS_TRANSCRIBE_USE_GRPC,
LEGACY_CRYPTO,
JWT_SECRET,
@@ -214,7 +225,11 @@ module.exports = {
JAMBONZ_RECORD_WS_BASE_URL,
JAMBONZ_RECORD_WS_USERNAME,
JAMBONZ_RECORD_WS_PASSWORD,
JAMBONZ_DISABLE_DIAL_PAI_HEADER,
JAMBONZ_DIAL_PAI_HEADER,
JAMBONES_DISABLE_DIRECT_P2P_CALL,
JAMBONES_USE_FREESWITCH_TIMER_FD
JAMBONES_USE_FREESWITCH_TIMER_FD,
JAMBONES_DIAL_SBC_FOR_REGISTERED_USER,
JAMBONES_MEDIA_TIMEOUT_MS,
JAMBONES_MEDIA_HOLD_TIMEOUT_MS,
JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS,
};

View File

@@ -3,7 +3,7 @@ const makeTask = require('../../tasks/make_task');
const RestCallSession = require('../../session/rest-call-session');
const CallInfo = require('../../session/call-info');
const {CallDirection, CallStatus} = require('../../utils/constants');
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const SipError = require('drachtio-srf').SipError;
const { validationResult, body } = require('express-validator');
const { validate } = require('@jambonz/verb-specifications');
@@ -12,8 +12,12 @@ const HttpRequestor = require('../../utils/http-requestor');
const WsRequestor = require('../../utils/ws-requestor');
const RootSpan = require('../../utils/call-tracer');
const dbUtils = require('../../utils/db-utils');
const { mergeSdpMedia, extractSdpMedia } = require('../../utils/sdp-utils');
const { decrypt } = require('../../utils/encrypt-decrypt');
const { mergeSdpMedia, extractSdpMedia, removeVideoSdp } = require('../../utils/sdp-utils');
const { createCallSchema, customSanitizeFunction } = require('../schemas/create-call');
const { selectHostPort } = require('../../utils/network');
const { JAMBONES_DIAL_SBC_FOR_REGISTERED_USER } = require('../../config');
const { createMediaEndpoint } = require('../../utils/media-endpoint');
const removeNullProperties = (obj) => (Object.keys(obj).forEach((key) => obj[key] === null && delete obj[key]), obj);
const removeNulls = (req, res, next) => {
@@ -64,8 +68,8 @@ router.post('/',
const {
lookupAppBySid
} = srf.locals.dbHelpers;
const {getSBC, getFreeswitch} = srf.locals;
const sbcAddress = getSBC();
const {getSBC} = srf.locals;
let sbcAddress = getSBC();
if (!sbcAddress) throw new Error('no available SBCs for outbound call creation');
const target = restDial.to;
const opts = {
@@ -78,7 +82,7 @@ router.post('/',
const {lookupTeamsByAccount, lookupAccountBySid} = srf.locals.dbHelpers;
const account = await lookupAccountBySid(req.body.account_sid);
const accountInfo = await lookupAccountDetails(req.body.account_sid);
const callSid = uuidv4();
const callSid = crypto.randomUUID();
const application = req.body.application_sid ? await lookupAppBySid(req.body.application_sid) : null;
const record_all_calls = account.record_all_calls || (application && application.record_all_calls);
const recordOutputFormat = account.record_format || 'mp3';
@@ -97,7 +101,9 @@ router.post('/',
'X-Trace-ID': rootSpan.traceId,
...(req.body?.application_sid && {'X-Application-Sid': req.body.application_sid}),
...(restDial.fromHost && {'X-Preferred-From-Host': restDial.fromHost}),
...(record_all_calls && {'X-Record-All-Calls': recordOutputFormat})
...(record_all_calls && {'X-Record-All-Calls': recordOutputFormat}),
...(target.proxy && {'X-SIP-Proxy': target.proxy}),
...target.headers
};
switch (target.type) {
@@ -139,11 +145,23 @@ router.post('/',
}
}
// find the SBC that is handling SIP registration for the called user
if (JAMBONES_DIAL_SBC_FOR_REGISTERED_USER && target.type === 'user') {
const { registrar} = srf.locals.dbHelpers;
const reg = await registrar.query(target.name);
if (reg) {
sbcAddress = selectHostPort(logger, reg.sbcAddress, 'tcp')[1];
}
// the outbound SBC returns 404 Not Found if the called user is not registered.
}
/**
* trunk isn't specified,
* check if from-number matches any existing numbers on Jambonz
* */
if (target.type === 'phone' && !target.trunk) {
const { lookupLcrByAccount} = srf.locals.dbHelpers;
const lcrs = await lookupLcrByAccount(req.body.account_sid);
if (target.type === 'phone' && !target.trunk && lcrs.length == 0) {
const str = restDial.from || '';
const callingNumber = str.startsWith('+') ? str.substring(1) : str;
const voip_carrier_sid = await lookupCarrierByPhoneNumber(req.body.account_sid, callingNumber);
@@ -155,9 +173,7 @@ router.post('/',
}
/* create endpoint for outdial */
const ms = getFreeswitch();
if (!ms) throw new Error('no available Freeswitch for outbound call creation');
const ep = await ms.createEndpoint();
const ep = await createMediaEndpoint(srf, logger);
logger.debug(`createCall: successfully allocated endpoint, sending INVITE to ${sbcAddress}`);
/* launch outdial */
@@ -166,10 +182,14 @@ router.post('/',
let localSdp = ep.local.sdp;
if (req.body.dual_streams) {
dualEp = await ms.createEndpoint();
dualEp = await createMediaEndpoint(srf, logger);
localSdp = mergeSdpMedia(localSdp, dualEp.local.sdp);
}
if (process.env.JAMBONES_VIDEO_CALLS_ENABLED_IN_FS) {
logger.debug('createCall: removing video sdp');
localSdp = removeVideoSdp(localSdp);
ep.modify(localSdp);
}
const connectStream = async(remoteSdp) => {
if (remoteSdp !== sdp) {
sdp = remoteSdp;
@@ -195,10 +215,20 @@ router.post('/',
/**
* create our application object -
* not from the database as per an inbound call,
* but from the provided params in the request
* we merge the inbound call application,
* with the provided app params from the request body
*/
const app = req.body;
try {
if (application?.env_vars && Object.keys(application.env_vars).length > 0) {
restDial.env_vars = JSON.parse(decrypt(application.env_vars));
}
} catch (err) {
logger.info({err}, 'Unable to set env_vars');
}
const app = {
...application,
...req.body
};
/**
* attach our requestor and notifier objects
@@ -207,9 +237,10 @@ router.post('/',
if ('WS' === app.call_hook?.method || /^wss?:/.test(app.call_hook.url)) {
logger.debug({call_hook: app.call_hook}, 'creating websocket for call hook');
app.requestor = new WsRequestor(logger, account.account_sid, app.call_hook, account.webhook_secret) ;
if (app.call_hook.url === app.call_status_hook.url || !app.call_status_hook?.url) {
if (app.call_hook.url === app.call_status_hook?.url || !app.call_status_hook?.url) {
logger.debug('reusing websocket for call status hook');
app.notifier = app.requestor;
app.call_status_hook = app.call_hook;
}
}
else {
@@ -218,7 +249,7 @@ router.post('/',
}
if (!app.notifier && app.call_status_hook) {
app.notifier = new HttpRequestor(logger, account.account_sid, app.call_status_hook, account.webhook_secret);
logger.debug({call_hook: app.call_hook}, 'creating http client for call status hook');
logger.debug({call_status_hook: app.call_status_hook}, 'creating http client for call status hook');
}
else if (!app.notifier) {
logger.debug('creating null call status hook');
@@ -257,6 +288,8 @@ router.post('/',
callId: inviteReq.get('Call-ID'),
accountSid,
traceId: rootSpan.traceId
}, {
...(account.enable_debug_log && {level: 'debug'})
});
app.requestor.logger = app.notifier.logger = sipLogger;
const callInfo = new CallInfo({
@@ -290,6 +323,8 @@ router.post('/',
},
cbProvisional: (prov) => {
const callStatus = prov.body ? CallStatus.EarlyMedia : CallStatus.Ringing;
// Update call-id for sbc outbound INVITE
cs.callInfo.sbcCallid = prov.get('X-CID');
if ([180, 183].includes(prov.status) && prov.body) connectStream(prov.body);
restDial.emit('callStatus', prov.status, !!prov.body);
cs.emit('callStatusChange', {callStatus, sipStatus: prov.status});

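A small change that recurs in this diff (here, in the incoming-call middleware, and in the CallInfo constructor further down) is dropping the uuid-random package in favor of Node's built-in crypto module for generating call SIDs; on the node:20 base image the Dockerfile now uses, no extra dependency is needed:

const crypto = require('crypto');

const callSid = crypto.randomUUID();   // RFC 4122 v4 UUID, e.g. '3b241101-e2bb-4255-8caf-4136c566a962'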
View File

@@ -116,8 +116,8 @@ const customSanitizeFunction = (value) => {
/* trims characters at the beginning and at the end of a string */
value = value.trim();
/* Verify strings including 'http' via new URL */
if (value.includes('http')) {
// Only attempt to parse if the whole string is a URL
if (/^https?:\/\/\S+$/.test(value)) {
value = new URL(value).toString();
}
}
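The tightened check above avoids throwing when a value merely contains 'http' somewhere inside a longer string; only values that are entirely a URL are run through new URL(). Roughly the behavior after the change:

const customSanitizeFunction = (value) => {
  value = value.trim();
  // Only attempt to parse if the whole string is a URL
  if (/^https?:\/\/\S+$/.test(value)) {
    value = new URL(value).toString();
  }
  return value;
};

customSanitizeFunction(' https://example.com/path ');           // normalized to 'https://example.com/path'
customSanitizeFunction('see https://example.com for details');  // returned unchanged (previously new URL() threw here)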

View File

@@ -1,5 +1,5 @@
const uuidv4 = require('uuid-random');
const {CallDirection, AllowedSipRecVerbs} = require('./utils/constants');
const crypto = require('crypto');
const {CallDirection, AllowedSipRecVerbs, WS_CLOSE_CODES} = require('./utils/constants');
const {parseSiprecPayload} = require('./utils/siprec-utils');
const CallInfo = require('./session/call-info');
const HttpRequestor = require('./utils/http-requestor');
@@ -15,6 +15,7 @@ const {
JAMBONES_DISABLE_DIRECT_P2P_CALL
} = require('./config');
const { createJambonzApp } = require('./dynamic-apps');
const { decrypt } = require('./utils/encrypt-decrypt');
module.exports = function(srf, logger) {
const {
@@ -45,7 +46,7 @@ module.exports = function(srf, logger) {
logger.info('getAccountDetails - rejecting call due to missing X-Account-Sid header');
return res.send(500);
}
const callSid = req.has('X-Retain-Call-Sid') ? req.get('X-Retain-Call-Sid') : uuidv4();
const callSid = req.has('X-Retain-Call-Sid') ? req.get('X-Retain-Call-Sid') : crypto.randomUUID();
const account_sid = req.get('X-Account-Sid');
req.locals = {callSid, account_sid, callId};
@@ -111,6 +112,14 @@ module.exports = function(srf, logger) {
req.locals.callingNumber = sipURIs[1];
}
}
// The feature server's INVITE pipeline takes time to complete while it connects,
// fetches the application from the database, and invokes the webhook.
// A call can be canceled during that window without any handling, so register a listener here
req.once('cancel', (sipMsg) => {
logger.info(`${callId} got CANCEL request`);
req.locals.canceled = true;
});
next();
}
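The 'cancel' listener added above only records that a CANCEL arrived; the session constructed later (see inbound-call-session.js further down) checks the flag, so a caller who hangs up during the webhook round-trip is still cleaned up. A condensed sketch of the pattern, with drachtio-srf details omitted:

// middleware: remember that the caller gave up while we were still fetching the application
const rememberCancel = (req, logger) => {
  req.once('cancel', () => {
    logger.info(`${req.get('Call-ID')} got CANCEL request`);
    req.locals.canceled = true;
  });
};

// session constructor: handle a CANCEL that raced the application fetch
class SomeCallSession {
  constructor(req) {
    this.req = req;
    if (req.locals.canceled) this._onCancel();          // already canceled, release resources now
    else req.once('cancel', this._onCancel.bind(this));
  }
  _onCancel() { /* send final status callback, free endpoints, etc. */ }
}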
@@ -187,14 +196,20 @@ module.exports = function(srf, logger) {
const {span} = rootSpan.startChildSpan('lookupAccountDetails');
try {
req.locals.accountInfo = await lookupAccountDetails(account_sid);
req.locals.service_provider_sid = req.locals.accountInfo?.account?.service_provider_sid;
const accountDetail = await lookupAccountDetails(account_sid);
const account = accountDetail?.account;
req.locals.accountInfo = accountDetail;
req.locals.service_provider_sid = account?.service_provider_sid;
span.end();
if (!req.locals.accountInfo.account.is_active) {
if (!account?.is_active) {
logger.info(`Account is inactive or suspended ${account_sid}`);
// TODO: alert
return res.send(503, {headers: {'X-Reason': 'Account exists but is inactive'}});
}
// Change the default log level to debug
if (account?.enable_debug_log) {
req.locals.logger.level = 'debug';
}
logger.debug({accountInfo: req.locals?.accountInfo?.account}, `retrieved account info for ${account_sid}`);
next();
} catch (err) {
@@ -325,12 +340,14 @@ module.exports = function(srf, logger) {
}
// Resolve application.speech_synthesis_voice if it's custom voice
if (app2.speech_synthesis_vendor === 'google' && app2.speech_synthesis_voice.startsWith('custom_')) {
if (app2.speech_synthesis_vendor === 'google' && app2.speech_synthesis_voice?.startsWith('custom_')) {
const arr = /custom_(.*)/.exec(app2.speech_synthesis_voice);
if (arr) {
const google_custom_voice_sid = arr[1];
const [custom_voice] = await lookupGoogleCustomVoice(google_custom_voice_sid);
if (custom_voice) {
//a google voice-cloning key can be around 200kb; jambonz should not resolve the voice here, or the app's calling
//webhook would receive a large payload. The tts task should resolve the voice later.
if (!custom_voice.use_voice_cloning_key) {
app2.speech_synthesis_voice = {
reportedUsage: custom_voice.reported_usage,
model: custom_voice.model
@@ -340,11 +357,10 @@ module.exports = function(srf, logger) {
}
req.locals.application = app2;
// eslint-disable-next-line no-unused-vars
const {call_hook, call_status_hook, ...appInfo} = app; // mask sensitive data like user/pass on webhook
// eslint-disable-next-line no-unused-vars
const {requestor, notifier, ...loggable} = appInfo;
const {requestor, notifier, env_vars, ...loggable} = appInfo;
logger.info({app: loggable}, `retrieved application for incoming call to ${req.locals.calledNumber}`);
req.locals.callInfo = new CallInfo({
req,
@@ -354,11 +370,14 @@ module.exports = function(srf, logger) {
});
// if the transferred call contains callInfo, copy the original data into the callInfo newly created in this instance.
if (app.transferredCall && app.callInfo) {
req.locals.callInfo.callerName = app.callInfo.callerName;
req.locals.callInfo.from = app.callInfo.from;
req.locals.callInfo.to = app.callInfo.to;
req.locals.callInfo.originatingSipIp = app.callInfo.originatingSipIp;
req.locals.callInfo.originatingSipTrunkName = app.callInfo.originatingSipTrunkName;
const {direction, callerName, from, to, originatingSipIp, originatingSipTrunkName, customerData} = app.callInfo;
req.locals.callInfo.direction = direction;
req.locals.callInfo.callerName = callerName;
req.locals.callInfo.from = from;
req.locals.callInfo.to = to;
req.locals.callInfo.originatingSipIp = originatingSipIp;
req.locals.callInfo.originatingSipTrunkName = originatingSipTrunkName;
if (customerData) req.locals.callInfo.customerData = customerData;
delete app.callInfo;
}
next();
@@ -377,7 +396,7 @@ module.exports = function(srf, logger) {
const {rootSpan, siprec, application:app} = req.locals;
let span;
try {
if (app.tasks && !JAMBONES_MYSQL_REFRESH_TTL) {
if (app.tasks && app.tasks?.length > 0 && !JAMBONES_MYSQL_REFRESH_TTL) {
app.tasks = normalizeJambones(logger, app.tasks).map((tdata) => makeTask(logger, tdata));
if (0 === app.tasks.length) throw new Error('no application provided');
return next();
@@ -407,16 +426,28 @@ module.exports = function(srf, logger) {
...(app.fallback_speech_recognizer_language && {fallback_language: app.fallback_speech_recognizer_language})
}
};
let env_vars;
try {
if (app.env_vars) {
const d_env_vars = JSON.parse(decrypt(app.env_vars));
logger.info(`Setting env_vars: ${Object.keys(d_env_vars)}`); // Only log the keys not the values
env_vars = d_env_vars;
}
} catch (err) {
logger.info({err}, 'Unable to set env_vars');
}
const params = Object.assign(['POST', 'WS'].includes(app.call_hook.method) ? { sip: req.msg } : {},
req.locals.callInfo,
{ service_provider_sid: req.locals.service_provider_sid },
{ defaults });
{ defaults },
{ env_vars }
);
logger.debug({ params }, 'sending initial webhook');
const obj = rootSpan.startChildSpan('performAppWebhook');
span = obj.span;
const b3 = rootSpan.getTracingPropagation();
const httpHeaders = b3 && { b3 };
json = await app.requestor.request('session:new', app.call_hook, params, httpHeaders);
json = await app.requestor.request('session:new', app.call_hook, params, httpHeaders, span);
}
app.tasks = normalizeJambones(logger, json).map((tdata) => makeTask(logger, tdata));
@@ -450,7 +481,7 @@ module.exports = function(srf, logger) {
}).catch((err) => this.logger.info({err}, 'Error generating alert for parsing application'));
logger.info({err}, `Error retrieving or parsing application: ${err?.message}`);
res.send(480, {headers: {'X-Reason': err?.message || 'unknown'}});
app.requestor.close();
app.requestor.close(WS_CLOSE_CODES.GoingAway);
}
}

View File

@@ -45,8 +45,10 @@ class AdultingCallSession extends CallSession {
return this.sd.ep;
}
/* see note above */
set ep(newEp) {}
// When the adulting session is kicked from a conference, the endpoint must be replaceable
set ep(newEp) {
this.sd.ep = newEp;
}
get callSid() {
return this.callInfo.callSid;

View File

@@ -1,6 +1,6 @@
const {CallDirection, CallStatus} = require('../utils/constants');
const parseUri = require('drachtio-srf').parseUri;
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const {JAMBONES_API_BASE_URL} = require('../config');
/**
* @classdesc Represents the common information for all calls
@@ -32,6 +32,7 @@ class CallInfo {
this.sipStatus = 100;
this.sipReason = 'Trying';
this.callStatus = CallStatus.Trying;
this.sbcCallid = req.get('X-CID');
this.originatingSipIp = req.get('X-Forwarded-For');
this.originatingSipTrunkName = req.get('X-Originating-Carrier');
const {siprec} = req.locals;
@@ -56,7 +57,7 @@ class CallInfo {
// outbound call that is a child of an existing call
const {req, parentCallInfo, to, callSid} = opts;
srf = req.srf;
this.callSid = callSid || uuidv4();
this.callSid = callSid || crypto.randomUUID();
this.parentCallSid = parentCallInfo.callSid;
this.accountSid = parentCallInfo.accountSid;
this.applicationSid = parentCallInfo.applicationSid;
@@ -129,6 +130,7 @@ class CallInfo {
from: this.from,
to: this.to,
callId: this.callId,
sbcCallid: this.sbcCallid,
sipStatus: this.sipStatus,
sipReason: this.sipReason,
callStatus: this.callStatus,

File diff suppressed because it is too large

View File

@@ -8,7 +8,8 @@ const CallSession = require('./call-session');
*/
class ConfirmCallSession extends CallSession {
constructor({logger, application, dlg, ep, tasks, callInfo, accountInfo, memberId, confName, rootSpan}) {
// eslint-disable-next-line max-len
constructor({logger, application, dlg, ep, tasks, callInfo, accountInfo, memberId, confName, rootSpan, req, tmpFiles}) {
super({
logger,
application,
@@ -23,6 +24,8 @@ class ConfirmCallSession extends CallSession {
});
this.dlg = dlg;
this.ep = ep;
this.req = req;
this.tmpFiles = tmpFiles;
}
/**

View File

@@ -22,6 +22,12 @@ class InboundCallSession extends CallSession {
this.req = req;
this.res = res;
// if the call was canceled before we got here, handle it
if (this.req.locals.canceled) {
req.locals.logger.info('InboundCallSession: constructor - call was already canceled');
this._onCancel();
}
req.once('cancel', this._onCancel.bind(this));
this.on('callStatusChange', this._notifyCallStatusChange.bind(this));
@@ -70,8 +76,14 @@ class InboundCallSession extends CallSession {
this._hangup('caller');
}
_jambonzHangup() {
this.dlg?.destroy();
_jambonzHangup(reason) {
this.dlg?.destroy({
headers: {
...(reason && {'X-Reason': reason})
}
});
// kill current task or wakeup the call session.
this._callReleased();
}
_hangup(terminatedBy = 'jambonz') {

View File

@@ -1,10 +1,6 @@
const CallSession = require('./call-session');
const {CallStatus} = require('../utils/constants');
const moment = require('moment');
const {parseUri} = require('drachtio-srf');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const makeTask = require('../tasks/make_task');
/**
* @classdesc Subclass of CallSession. This represents a CallSession that is
* created for an outbound call that is initiated via the REST API.
@@ -31,10 +27,13 @@ class RestCallSession extends CallSession {
}
this.on('callStatusChange', this._notifyCallStatusChange.bind(this));
this._notifyCallStatusChange({
callStatus: CallStatus.Trying,
sipStatus: 100,
sipReason: 'Trying'
setImmediate(() => {
this._notifyCallStatusChange({
callStatus: CallStatus.Trying,
sipStatus: 100,
sipReason: 'Trying'
});
});
}
@@ -46,61 +45,9 @@ class RestCallSession extends CallSession {
this.dlg = dlg;
dlg.on('destroy', this._callerHungup.bind(this));
dlg.on('refer', this._onRefer.bind(this));
dlg.on('modify', this._onReinvite.bind(this));
this.wrapDialog(dlg);
}
/**
* global referHook
*/
set referHook(hook) {
this._referHook = hook;
}
/**
* This is invoked when the called party sends REFER to Jambonz.
*/
async _onRefer(req, res) {
if (this._referHook) {
try {
const to = parseUri(req.getParsedHeader('Refer-To').uri);
const by = parseUri(req.getParsedHeader('Referred-By').uri);
const b3 = this.b3;
const httpHeaders = b3 && {b3};
const json = await this.requestor.request('verb:hook', this._referHook, {
...(this.callInfo.toJSON()),
refer_details: {
sip_refer_to: req.get('Refer-To'),
sip_referred_by: req.get('Referred-By'),
sip_user_agent: req.get('User-Agent'),
refer_to_user: to.scheme === 'tel' ? to.number : to.user,
referred_by_user: by.scheme === 'tel' ? by.number : by.user,
referring_call_sid: this.callSid,
referred_call_sid: null,
}
}, httpHeaders);
if (json && Array.isArray(json)) {
const tasks = normalizeJambones(this.logger, json).map((tdata) => makeTask(this.logger, tdata));
if (tasks && tasks.length > 0) {
this.logger.info('RestCallSession:handleRefer received REFER, get new tasks');
this.replaceApplication(tasks);
if (this.wakeupResolver) {
this.wakeupResolver({reason: 'RestCallSession: referHook new taks'});
this.wakeupResolver = null;
}
}
}
res.send(202);
this.logger.info('RestCallSession:handleRefer - sent 202 Accepted');
} catch (err) {
this.logger.error({err}, 'RestCallSession:handleRefer - error while asking referHook');
res.send(err.statusCode || 501);
}
} else {
res.send(501);
}
}
/**
* This is invoked when the called party hangs up, in order to calculate the call duration.
*/
@@ -119,7 +66,7 @@ class RestCallSession extends CallSession {
this.callInfo.callTerminationBy = terminatedBy;
const duration = moment().diff(this.dlg.connectTime, 'seconds');
this.emit('callStatusChange', {callStatus: CallStatus.Completed, duration});
this.logger.debug(`RestCallSession: called party hung up by ${terminatedBy}`);
this.logger.info(`RestCallSession: called party hung up by ${terminatedBy}`);
this._callReleased();
}
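Wrapping the initial Trying notification in setImmediate defers it by one event-loop tick, so listeners attached after the constructor returns still observe the event and the rest of session setup runs first. A minimal illustration with a plain EventEmitter:

const { EventEmitter } = require('events');

class Session extends EventEmitter {
  constructor() {
    super();
    // deferring the emit lets listeners added after construction still receive it
    setImmediate(() => this.emit('callStatusChange', { callStatus: 'trying', sipStatus: 100 }));
  }
}

const s = new Session();
s.on('callStatusChange', (evt) => console.log('got', evt));   // fires, because the emit was deferred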

View File

@@ -45,12 +45,11 @@ class SipRecCallSession extends InboundCallSession {
async answerSipRecCall() {
try {
this.ms = this.getMS();
let remoteSdp = this.sdp1.replace(/sendonly/, 'sendrecv');
this.ep = await this.ms.createEndpoint({remoteSdp});
this.ep = await this._createMediaEndpoint({remoteSdp});
//this.logger.debug({remoteSdp, localSdp: this.ep.local.sdp}, 'SipRecCallSession - allocated first endpoint');
remoteSdp = this.sdp2.replace(/sendonly/, 'sendrecv');
this.ep2 = await this.ms.createEndpoint({remoteSdp});
this.ep2 = await this._createMediaEndpoint({remoteSdp});
//this.logger.debug({remoteSdp, localSdp: this.ep2.local.sdp}, 'SipRecCallSession - allocated second endpoint');
await this.ep.bridge(this.ep2);
const combinedSdp = await createSipRecPayload(this.ep.local.sdp, this.ep2.local.sdp, this.logger);

31
lib/tasks/alert.js Normal file
View File

@@ -0,0 +1,31 @@
const Task = require('./task');
const {TaskName} = require('../utils/constants');
class TaskAlert extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts);
this.message = this.data.message;
}
get name() { return TaskName.Alert; }
async exec(cs) {
const {srf, accountSid:account_sid, callSid:target_sid, applicationSid:application_sid} = cs;
const {writeAlerts, AlertType} = srf.locals;
await super.exec(cs);
writeAlerts({
account_sid,
alert_type: AlertType.APPLICATION,
detail: `Application SID ${application_sid}`,
message: this.message,
target_sid
}).catch((err) => this.logger.info({err}, 'Error generating alert application'));
}
async kill(cs) {
super.kill(cs);
this.notifyTaskDone();
}
}
module.exports = TaskAlert;
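The new task writes an application-level alert via writeAlerts, tagging it with the account, application and call SIDs. Assuming it is exposed under the verb name 'alert' (suggested by TaskName.Alert but not shown in this diff), an application could trigger it like this; the payload shape beyond 'message' is an assumption:

// hypothetical webhook response using the new verb
const verbs = [
  {
    verb: 'alert',                              // assumed verb name, per TaskName.Alert
    message: 'caller reached the fallback flow' // stored as the alert message
  },
  {
    verb: 'say',
    text: 'Please hold while we connect you.'
  }
];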

View File

@@ -49,7 +49,8 @@ class Conference extends Task {
this.confName = this.data.name;
[
'beep', 'startConferenceOnEnter', 'endConferenceOnExit', 'joinMuted',
'maxParticipants', 'waitHook', 'statusHook', 'endHook', 'enterHook', 'endConferenceDuration'
'maxParticipants', 'waitHook', 'statusHook', 'endHook', 'enterHook',
'endConferenceDuration', 'distributeDtmf'
].forEach((attr) => this[attr] = this.data[attr]);
this.record = this.data.record || {};
this.statusEvents = [];
@@ -83,7 +84,11 @@ class Conference extends Task {
// reset answer time if we were transferred from another feature server
if (this.connectTime) dlg.connectTime = this.connectTime;
if (cs.sipRequestWithinDialogHook) {
/* remove any existing listener to avoid duplicate events */
this._removeSipIndialogRequestListener(this.dlg);
this._initSipIndialogRequestListener(cs, dlg);
}
this.ep.on('destroy', this._kicked.bind(this, cs, dlg));
try {
@@ -103,6 +108,7 @@ class Conference extends Task {
this.logger.debug(`Conference:exec - conference ${this.confName} is over`);
if (this.callMoved !== false) await this.performAction(this.results);
this._removeSipIndialogRequestListener(dlg);
} catch (err) {
this.logger.info(err, `TaskConference:exec - error in conference ${this.confName}`);
}
@@ -118,7 +124,9 @@ class Conference extends Task {
this.emitter.emit('kill');
await this._doFinalMemberCheck(cs);
if (this.ep && this.ep.connected) {
this.ep.conn.removeAllListeners('esl::event::CUSTOM::*');
// drachtio-fsmrf overrides esl::event::CUSTOM with a conference-join listener; after the conference
// finishes, the application needs to reset esl::event::CUSTOM so the same endpoint can be reused
this.ep.resetEslCustomEvent();
this.ep.api(`conference ${this.confName} kick ${this.memberId}`)
.catch((err) => this.logger.info({err}, 'Error kicking participant'));
}
@@ -135,15 +143,10 @@ class Conference extends Task {
* @param {SipDialog} dlg
*/
async _init(cs, dlg) {
const friendlyName = this.confName;
const {createHash, retrieveHash} = cs.srf.locals.dbHelpers;
this.friendlyName = this.confName;
this.confName = `conf:${cs.accountSid}:${this.confName}`;
this.statusParams = Object.assign({
conferenceSid: this.confName,
friendlyName
}, cs.callInfo);
// check if conference is in progress
const obj = await retrieveHash(this.confName);
if (obj) {
@@ -354,6 +357,7 @@ class Conference extends Task {
//https://developer.signalwire.com/freeswitch/FreeSWITCH-Explained/Modules/mod_conference_3965534/
// mute | Enter conference muted
...((this.joinMuted || this.speakOnlyTo) && {mute: true}),
...(this.distributeDtmf && {'dist-dtmf': true})
}});
/**
@@ -419,6 +423,20 @@ class Conference extends Task {
}
}
_initSipIndialogRequestListener(cs, dlg) {
dlg.on('info', this._onRequestWithinDialog.bind(this, cs));
dlg.on('message', this._onRequestWithinDialog.bind(this, cs));
}
_removeSipIndialogRequestListener(dlg) {
dlg && dlg.removeAllListeners('message');
dlg && dlg.removeAllListeners('info');
}
_onRequestWithinDialog(cs, req, res) {
cs._onRequestWithinDialog(req, res);
}
/**
* The conference we have been waiting for has started.
* It may be on this server or a different one, and we are
@@ -493,7 +511,7 @@ class Conference extends Task {
}
async doConferenceParticipantAction(cs, opts) {
const {action, tag} = opts;
const {action, tag, wait_hook } = opts;
switch (action) {
case 'tag':
@@ -509,7 +527,10 @@ class Conference extends Task {
await this.clearCoachMode();
break;
case 'hold':
this.doConferenceHold(cs, {conf_hold_status: 'hold'});
this.doConferenceHold(cs, {
conf_hold_status: 'hold',
...(wait_hook && {wait_hook})
});
break;
case 'unhold':
this.doConferenceHold(cs, {conf_hold_status: 'unhold'});
@@ -546,6 +567,13 @@ class Conference extends Task {
} while (!this.killed && this.conf_hold_status === 'hold');
}
/**
* mute or unmute side of the call
*/
mute(callSid, doMute) {
this.doConferenceMute(this.callSession, {conf_mute_status: doMute});
}
/**
* Add ourselves to the waitlist of sessions to be notified once
* the conference starts
@@ -604,7 +632,7 @@ class Conference extends Task {
* when we hang up as the last member, the current member count = 1
* when we are kicked out of the call when the moderator leaves, the member count = 0
*/
if (this.participantCount === 0) {
if (this.participantCount === 0 || this.endConferenceOnExit) {
const {deleteKey} = cs.srf.locals.dbHelpers;
try {
this._notifyConferenceEvent(cs, 'end');
@@ -612,7 +640,8 @@ class Conference extends Task {
this.logger.info(`conf ${this.confName} deprovisioned: ${removed ? 'success' : 'failure'}`);
}
catch (err) {
this.logger.error(err, `Error deprovisioning conference ${this.confName}`);
this.logger.error(err, `Error deprovisioning conference ${this.confName},
the conference may already have been cleaned up by another moderator`);
}
}
}
@@ -645,7 +674,9 @@ class Conference extends Task {
memberId: this.memberId,
confName: this.confName,
tasks,
rootSpan: cs.rootSpan
rootSpan: cs.rootSpan,
req: cs.req,
tmpFiles: cs.tmpFiles,
});
await this._playSession.exec();
this._playSession = null;
@@ -689,8 +720,24 @@ class Conference extends Task {
if (!params.time) params.time = (new Date()).toISOString();
if (!params.members && typeof this.participantCount === 'number') params.members = this.participantCount;
cs.application.requestor
.request('verb:hook', this.statusHook, Object.assign(params, this.statusParams, httpHeaders))
.catch((err) => this.logger.info(err, 'Conference:notifyConferenceEvent - error'));
.request(
'verb:hook',
this.statusHook,
Object.assign(
params,
Object.assign(
{
conferenceSid: this.confName,
friendlyName: this.friendlyName,
},
cs.callInfo.toJSON()
),
httpHeaders
)
)
.catch((err) =>
this.logger.info(err, 'Conference:notifyConferenceEvent - error')
);
}
}
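The new distributeDtmf option is passed straight through to FreeSWITCH's dist-dtmf member flag, so DTMF pressed by one participant is re-broadcast to the other members. A hedged usage sketch of the conference verb with the new option (other options omitted):

const verbs = [
  {
    verb: 'conference',
    name: 'support-bridge',
    beep: true,
    distributeDtmf: true   // maps to the FreeSWITCH dist-dtmf member flag added above
  }
];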

View File

@@ -16,12 +16,18 @@ class TaskConfig extends Task {
'fillerNoise',
'actionHookDelayAction',
'boostAudioSignal',
'vad'
'vad',
'ttsStream',
'autoStreamTts',
'disableTtsCache'
].forEach((k) => this[k] = this.data[k] || {});
if ('notifyEvents' in this.data) {
this.notifyEvents = !!this.data.notifyEvents;
}
if (this.hasNotifySttLatency) {
this.notifySttLatency = !!this.data.notifySttLatency;
}
if (this.bargeIn.enable) {
this.gatherOpts = {
@@ -34,7 +40,8 @@ class TaskConfig extends Task {
'finishOnKey', 'input', 'numDigits', 'minDigits', 'maxDigits',
'interDigitTimeout', 'bargein', 'dtmfBargein', 'minBargeinWordCount', 'actionHook'
].forEach((k) => {
if (this.bargeIn[k]) this.gatherOpts[k] = this.bargeIn[k];
const val = this.bargeIn[k];
if (val !== undefined && val !== null) this.gatherOpts[k] = val;
});
}
if (this.transcribe?.enable) {
@@ -44,6 +51,12 @@ class TaskConfig extends Task {
};
delete this.transcribeOpts.enable;
}
if (this.ttsStream.enable) {
this.sayOpts = {
verb: 'say',
stream: true
};
}
if (this.data.reset) {
if (typeof this.data.reset === 'string') this.data.reset = [this.data.reset];
@@ -73,6 +86,10 @@ class TaskConfig extends Task {
get hasDub() { return Object.keys(this.dub).length; }
get hasVad() { return Object.keys(this.vad).length; }
get hasFillerNoise() { return Object.keys(this.fillerNoise).length; }
get hasReferHook() { return Object.keys(this.data).includes('referHook'); }
get hasNotifySttLatency() { return Object.keys(this.data).includes('notifySttLatency'); }
get hasTtsStream() { return Object.keys(this.ttsStream).length; }
get hasDisableTtsCache() { return Object.keys(this.data).includes('disableTtsCache'); }
get summary() {
const phrase = [];
@@ -82,13 +99,13 @@ class TaskConfig extends Task {
if (this.bargeIn.enable) phrase.push('enable barge-in');
if (this.hasSynthesizer) {
const {vendor:v, language:l, voice} = this.synthesizer;
const s = `{${v},${l},${voice}}`;
const {vendor:v, language:l, voice, label} = this.synthesizer;
const s = `{${v},${l},${voice},${label || 'None'}}`;
phrase.push(`set synthesizer${s}`);
}
if (this.hasRecognizer) {
const {vendor:v, language:l} = this.recognizer;
const s = `{${v},${l}}`;
const {vendor:v, language:l, label} = this.recognizer;
const s = `{${v},${l},${label || 'None'}}`;
phrase.push(`set recognizer${s}`);
}
if (this.hasRecording) phrase.push(this.record.action);
@@ -101,8 +118,16 @@ class TaskConfig extends Task {
if (this.hasFillerNoise) phrase.push(`fillerNoise ${this.fillerNoise.enable ? 'on' : 'off'}`);
if (this.data.amd) phrase.push('enable amd');
if (this.notifyEvents) phrase.push(`event notification ${this.notifyEvents ? 'on' : 'off'}`);
if (this.hasNotifySttLatency) phrase.push(
`notifySttLatency ${this.notifySttLatency ? 'on' : 'off'}`);
if (this.onHoldMusic) phrase.push(`onHoldMusic: ${this.onHoldMusic}`);
if ('boostAudioSignal' in this.data) phrase.push(`setGain ${this.data.boostAudioSignal}`);
if (this.hasReferHook) phrase.push('set referHook');
if (this.hasTtsStream) {
phrase.push(`${this.ttsStream.enable ? 'enable' : 'disable'} ttsStream`);
}
if ('autoStreamTts' in this.data) phrase.push(`enable Say.stream value ${this.data.autoStreamTts ? 'on' : 'off'}`);
if (this.hasDisableTtsCache) phrase.push(`disableTtsCache ${this.data.disableTtsCache ? 'on' : 'off'}`);
return `${this.name}{${phrase.join(',')}}`;
}
@@ -114,6 +139,11 @@ class TaskConfig extends Task {
cs.notifyEvents = !!this.data.notifyEvents;
}
if (this.hasNotifySttLatency) {
this.logger.debug(`turning notifySttLatency ${this.notifySttLatency ? 'on' : 'off'}`);
cs.notifySttLatencyEnabled = this.notifySttLatency;
}
if (this.onHoldMusic) {
cs.onHoldMusic = this.onHoldMusic;
}
@@ -125,7 +155,7 @@ class TaskConfig extends Task {
try {
this.ep = ep;
this.startAmd(cs, ep, this, this.data.amd);
await this.startAmd(cs, ep, this, this.data.amd);
} catch (err) {
this.logger.info({err}, 'Config:exec - Error calling startAmd');
}
@@ -173,18 +203,20 @@ class TaskConfig extends Task {
: cs.speechRecognizerVendor;
cs.speechRecognizerLabel = this.recognizer.label === 'default'
? cs.speechRecognizerLabel : this.recognizer.label;
cs.speechRecognizerLanguage = this.recognizer.language !== 'default'
cs.speechRecognizerLanguage = this.recognizer.language !== undefined && this.recognizer.language !== 'default'
? this.recognizer.language
: cs.speechRecognizerLanguage;
//fallback
cs.fallbackSpeechRecognizerVendor = this.recognizer.fallbackVendor !== 'default'
cs.fallbackSpeechRecognizerVendor = this.recognizer.fallbackVendor !== undefined &&
this.recognizer.fallbackVendor !== 'default'
? this.recognizer.fallbackVendor
: cs.fallbackSpeechRecognizerVendor;
cs.fallbackSpeechRecognizerLabel = this.recognizer.fallbackLabel === 'default' ?
cs.fallbackSpeechRecognizerLabel :
this.recognizer.fallbackLabel;
cs.fallbackSpeechRecognizerLanguage = this.recognizer.fallbackLanguage !== 'default'
cs.fallbackSpeechRecognizerLanguage = this.recognizer.fallbackLanguage !== undefined &&
this.recognizer.fallbackLanguage !== 'default'
? this.recognizer.fallbackLanguage
: cs.fallbackSpeechRecognizerLanguage;
@@ -280,6 +312,11 @@ class TaskConfig extends Task {
});
}
if ('autoStreamTts' in this.data) {
this.logger.info(`Config: autoStreamTts set to ${this.data.autoStreamTts}`);
cs.autoStreamTts = this.data.autoStreamTts;
}
if (this.hasFillerNoise) {
const {enable, ...opts} = this.fillerNoise;
this.logger.info({fillerNoise: this.fillerNoise}, 'Config: fillerNoise');
@@ -295,9 +332,39 @@ class TaskConfig extends Task {
voiceMs: this.vad.voiceMs || 250,
silenceMs: this.vad.silenceMs || 150,
strategy: this.vad.strategy || 'one-shot',
mode: this.vad.mod || 2
mode: (this.vad.mode !== undefined && this.vad.mode !== null) ? this.vad.mode : 2,
vendor: this.vad.vendor || 'silero',
threshold: this.vad.threshold || 0.5,
speechPadMs: this.vad.speechPadMs || 30,
};
}
if (this.hasReferHook) {
cs.referHook = this.data.referHook;
}
if (this.ttsStream.enable && this.sayOpts) {
this.sayOpts.synthesizer = this.hasSynthesizer ? this.synthesizer : {
vendor: cs.speechSynthesisVendor,
language: cs.speechSynthesisLanguage,
voice: cs.speechSynthesisVoice,
...(cs.speechSynthesisLabel && {
label: cs.speechSynthesisLabel
})
};
this.logger.info({opts: this.gatherOpts}, 'Config: enabling ttsStream');
cs.enableBackgroundTtsStream(this.sayOpts);
}
// only disable ttsStream if it is explicitly set to false
else if (this.ttsStream.enable === false) {
this.logger.info('Config: disabling ttsStream');
cs.disableTtsStream();
}
if (this.hasDisableTtsCache) {
this.logger.info(`set disableTtsCache = ${this.disableTtsCache}`);
cs.disableTtsCache = this.data.disableTtsCache;
}
}
async kill(cs) {

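Taken together, the tts-related additions above let a single config verb start background TTS streaming, make subsequent say verbs stream by default, and bypass the TTS cache. A hedged example of how an application might send it; the property names mirror the keys handled above, but defaults and validation live in the verb specifications:

const verbs = [
  {
    verb: 'config',
    ttsStream: { enable: true },   // start a background streaming-TTS channel
    autoStreamTts: true,           // subsequent say verbs stream by default
    disableTtsCache: true          // skip the TTS cache for this session
  }
];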
View File

@@ -3,8 +3,7 @@ const {TaskName, TaskPreconditions, DequeueResults, BONG_TONE} = require('../uti
const Emitter = require('events');
const bent = require('bent');
const assert = require('assert');
const sleepFor = (ms) => new Promise((resolve) => setTimeout(() => resolve(), ms));
const { sleepFor } = require('../utils/helpers');
const getUrl = (cs) => `${cs.srf.locals.serviceUrl}/v1/dequeue/${cs.callSid}`;

View File

@@ -6,6 +6,7 @@ const {
TaskName,
TaskPreconditions,
MAX_SIMRINGS,
MediaPath,
KillReason
} = require('../utils/constants');
const assert = require('assert');
@@ -17,9 +18,13 @@ const dbUtils = require('../utils/db-utils');
const parseDecibels = require('../utils/parse-decibels');
const debug = require('debug')('jambonz:feature-server');
const {parseUri} = require('drachtio-srf');
const {ANCHOR_MEDIA_ALWAYS, JAMBONZ_DISABLE_DIAL_PAI_HEADER} = require('../config');
const { isOnhold, isOpusFirst } = require('../utils/sdp-utils');
const {ANCHOR_MEDIA_ALWAYS,
JAMBONZ_DIAL_PAI_HEADER,
JAMBONES_DIAL_SBC_FOR_REGISTERED_USER} = require('../config');
const { isOnhold, isOpusFirst, getLeadingCodec } = require('../utils/sdp-utils');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const { selectHostPort } = require('../utils/network');
const { sleepFor } = require('../utils/helpers');
function parseDtmfOptions(logger, dtmfCapture) {
let parentDtmfCollector, childDtmfCollector;
@@ -103,6 +108,8 @@ class TaskDial extends Task {
this.proxy = this.data.proxy;
this.tag = this.data.tag;
this.boostAudioSignal = this.data.boostAudioSignal;
this._mediaPath = MediaPath.FullMedia;
this.forwardPAI = this.data.forwardPAI;
if (this.dtmfHook) {
const {parentDtmfCollector, childDtmfCollector} = parseDtmfOptions(logger, this.data.dtmfCapture || {});
@@ -114,8 +121,9 @@ class TaskDial extends Task {
}
}
if (this.data.listen) {
this.listenTask = makeTask(logger, {'listen': this.data.listen}, this);
const listenData = this.data.listen || this.data.stream;
if (listenData) {
this.listenTask = makeTask(logger, {'listen': listenData }, this);
}
if (this.data.transcribe) {
this.transcribeTask = makeTask(logger, {'transcribe' : this.data.transcribe}, this);
@@ -150,17 +158,22 @@ class TaskDial extends Task {
get canReleaseMedia() {
const keepAnchor = this.data.anchorMedia ||
this.cs.isBackGroundListen ||
this.cs.onHoldMusic ||
ANCHOR_MEDIA_ALWAYS ||
this.listenTask ||
this.dubTasks ||
this.transcribeTask ||
this.startAmd;
this.weAreTranscoding ||
this.cs.isBackGroundListen ||
this.cs.onHoldMusic ||
ANCHOR_MEDIA_ALWAYS ||
this.listenTask ||
this.dubTasks ||
this.transcribeTask ||
this.startAmd;
return !keepAnchor;
}
get shouldExitMediaPathEntirely() {
return this.data.exitMediaPath;
}
get summary() {
if (this.target.length === 1) {
const target = this.target[0];
@@ -181,6 +194,16 @@ class TaskDial extends Task {
async exec(cs) {
await super.exec(cs);
if (this.data.anchorMedia && this.data.exitMediaPath) {
this.logger.info('Dial:exec - incompatible anchorMedia and exitMediaPath are both set, will obey anchorMedia');
delete this.data.exitMediaPath;
}
if (!this.canReleaseMedia && this.data.exitMediaPath) {
this.logger.info(
'Dial:exec - exitMediaPath is set so features such as transcribe and record will not work on this call');
}
try {
if (this.listenTask) {
const {span, ctx} = this.startChildSpan(`nested:${this.listenTask.summary}`);
@@ -203,7 +226,16 @@ class TaskDial extends Task {
else {
this.epOther = cs.ep;
if (this.dialMusic && this.epOther && this.epOther.connected) {
this.epOther.play(this.dialMusic).catch((err) => {});
(async() => {
do {
try {
await this.epOther.play(this.dialMusic);
} catch (err) {
this.logger.error(err, `TaskDial:exec error playing dialMusic ${this.dialMusic}`);
await sleepFor(1000);
}
} while (!this.killed && !this.bridged && this._mediaPath === MediaPath.FullMedia);
})();
}
}
if (!this.killed) await this._attemptCalls(cs);
@@ -241,21 +273,37 @@ class TaskDial extends Task {
}
this._removeDtmfDetection(cs.dlg);
this._removeDtmfDetection(this.dlg);
await this._killOutdials();
try {
await this._killOutdials();
}
catch (err) {
this.logger.info({err}, 'Dial:kill - error killing outdials');
}
if (this.sd) {
this.sd.kill();
const byeReasonHeader = this.killReason === KillReason.MediaTimeout ? 'Media Timeout' : undefined;
this.sd.kill(byeReasonHeader);
this.sd.ep?.removeListener('destroy', this._handleMediaTimeout.bind(this));
this.sd.removeAllListeners();
this.sd = null;
}
if (this.callSid) sessionTracker.remove(this.callSid);
if (this.listenTask) {
await this.listenTask.kill(cs);
this.listenTask.span.end();
try {
await this.listenTask.kill(cs);
this.listenTask?.span?.end();
}
catch (err) {
this.logger.error({err}, 'Dial:kill - error killing listen task');
}
this.listenTask = null;
}
if (this.transcribeTask) {
await this.transcribeTask.kill(cs);
this.transcribeTask.span.end();
try {
await this.transcribeTask.kill(cs);
this.transcribeTask?.span?.end();
} catch (err) {
this.logger.error({err}, 'Dial:kill - error killing transcribe task');
}
this.transcribeTask = null;
}
this.notifyTaskDone();
@@ -289,7 +337,7 @@ class TaskDial extends Task {
if (!cs.callGone && this.epOther) {
/* if we can release the media back to the SBC, do so now */
if (this.canReleaseMedia) this._releaseMedia(cs, this.sd);
if (this.canReleaseMedia) this._releaseMedia(cs, this.sd, this.shouldExitMediaPathEntirely);
else this.epOther.bridge(this.ep);
}
} catch (err) {
@@ -329,20 +377,33 @@ class TaskDial extends Task {
const to = parseUri(req.getParsedHeader('Refer-To').uri);
const by = parseUri(req.getParsedHeader('Referred-By').uri);
const referredBy = req.get('Referred-By');
const userAgent = req.get('User-Agent');
const customHeaders = Object.keys(req.headers)
.filter((h) => h.toLowerCase().startsWith('x-'))
.reduce((acc, h) => {
acc[h] = req.get(h);
return acc;
}, {});
this.logger.info({to}, 'refer to parsed');
const json = await cs.requestor.request('verb:hook', this.referHook, {
...(callInfo.toJSON()),
refer_details: {
sip_refer_to: req.get('Refer-To'),
sip_referred_by: req.get('Referred-By'),
sip_user_agent: req.get('User-Agent'),
refer_to_user: to.scheme === 'tel' ? to.number : to.user,
referred_by_user: by.scheme === 'tel' ? by.number : by.user,
...(referredBy && {sip_referred_by: referredBy}),
...(userAgent && {sip_user_agent: userAgent}),
...(by && {referred_by_user: by.scheme === 'tel' ? by.number : by.user}),
referring_call_sid,
referred_call_sid
referred_call_sid,
...customHeaders
}
}, httpHeaders);
if (json && Array.isArray(json)) {
res.send(202);
this.logger.info('DialTask:handleRefer - sent 202 Accepted');
const returnedInstructions = !!json && Array.isArray(json);
if (returnedInstructions) {
try {
const logger = isChild ? this.logger : this.sd.logger;
const tasks = normalizeJambones(logger, json).map((tdata) => makeTask(this.logger, tdata));
@@ -360,14 +421,23 @@ class TaskDial extends Task {
/* need to update the callSid of the child with its own (new) AdultingCallSession */
sessionTracker.add(adultingSession.callSid, adultingSession);
}
if (this.ep) this.ep.unbridge();
/* if we got the REFER on the parent leg, end the dial task after completing the refer */
if (!isChild) {
logger.info('DialTask:handleRefer - killing dial task after processing REFER on parent leg');
cs.currentTask?.kill(cs, KillReason.ReferComplete);
}
}
} catch (err) {
this.logger.info(err, 'Dial:handleRefer - error setting new application after receiving REFER');
}
}
res.send(202);
this.logger.info('DialTask:handleRefer - sent 202 Accepted');
else {
this.logger.info('DialTask:handleRefer - no tasks returned from referHook, not setting new application');
}
} catch (err) {
this.logger.info({err}, 'DialTask:handleRefer - error processing incoming REFER');
res.send(err.statusCode || 501);
}
}
@@ -453,7 +523,7 @@ class TaskDial extends Task {
dlg && dlg.removeAllListeners('info');
}
async _onRequestWithinDialog(cs, req, res) {
_onRequestWithinDialog(cs, req, res) {
cs._onRequestWithinDialog(req, res);
}
@@ -473,31 +543,41 @@ class TaskDial extends Task {
}
async _attemptCalls(cs) {
const {req, srf} = cs;
const {req, callInfo, direction, srf} = cs;
const {getSBC} = srf.locals;
const {lookupTeamsByAccount, lookupAccountBySid} = srf.locals.dbHelpers;
const {lookupCarrier, lookupCarrierByPhoneNumber} = dbUtils(this.logger, cs.srf);
const sbcAddress = this.proxy || getSBC();
const {lookupCarrier, lookupCarrierByPhoneNumber, lookupVoipCarrierBySid} = dbUtils(this.logger, cs.srf);
let sbcAddress = this.proxy || getSBC();
const teamsInfo = {};
let fqdn;
const forwardPAI = this.forwardPAI ?? JAMBONZ_DIAL_PAI_HEADER; // dial verb overides env var
this.logger.debug(forwardPAI, 'forwardPAI value');
if (!sbcAddress) throw new Error('no SBC found for outbound call');
this.headers = {
'X-Account-Sid': cs.accountSid,
...(req && req.has('X-CID') && {'X-CID': req.get('X-CID')}),
...(req && req.has('P-Asserted-Identity') && !JAMBONZ_DISABLE_DIAL_PAI_HEADER &&
{'P-Asserted-Identity': req.get('P-Asserted-Identity')}),
...(direction === 'outbound' && callInfo.sbcCallid && {'X-CID': callInfo.sbcCallid}),
...(req && forwardPAI && {
...(req.has('P-Asserted-Identity') && {'P-Asserted-Identity': req.get('P-Asserted-Identity')}),
...(req.has('Privacy') && {'Privacy': req.get('Privacy')}),
}),
...(req && req.has('X-Voip-Carrier-Sid') && {'X-Voip-Carrier-Sid': req.get('X-Voip-Carrier-Sid')}),
// Spread this.headers last so user-provided headers override all of the defaults above.
...this.headers
};
// default to inband dtmf if not specified
this.inbandDtmfEnabled = cs.inbandDtmfEnabled;
// get calling user from From header
const parsedFrom = req.getParsedHeader('from');
const fromUri = parseUri(parsedFrom.uri);
const opts = {
headers: this.headers,
proxy: `sip:${sbcAddress}`,
callingNumber: this.callerId || req.callingNumber,
callingNumber: this.callerId || fromUri.user,
...(this.callerName && {callingName: this.callerName}),
opusFirst: isOpusFirst(this.cs.ep.remote.sdp)
opusFirst: isOpusFirst(this.cs.ep.remote.sdp),
isVideoCall: this.cs.ep.remote.sdp.includes('m=video')
};
const t = this.target.find((t) => t.type === 'teams');
@@ -541,6 +621,15 @@ class TaskDial extends Task {
this.logger.error({err}, 'Error looking up account by sid');
}
}
// find the SBC that is handling SIP registration for the called user
if (JAMBONES_DIAL_SBC_FOR_REGISTERED_USER && t.type === 'user') {
const { registrar } = srf.locals.dbHelpers;
const reg = await registrar.query(t.name);
if (reg) {
sbcAddress = selectHostPort(this.logger, reg.sbcAddress, 'tcp')[1];
}
// the outbound SBC returns 404 Not Found if the called user is not registered.
}
if (t.type === 'phone' && t.trunk) {
const voip_carrier_sid = await lookupCarrier(cs.accountSid, t.trunk);
this.logger.info(`Dial:_attemptCalls: selected ${voip_carrier_sid} for requested carrier: ${t.trunk}`);
@@ -553,14 +642,23 @@ class TaskDial extends Task {
* trunk isn't specified,
* check if number matches any existing numbers
* */
if (t.type === 'phone' && !t.trunk) {
const { lookupLcrByAccount} = srf.locals.dbHelpers;
const lcrs = await lookupLcrByAccount(cs.accountSid);
if (t.type === 'phone' && !t.trunk && lcrs.length == 0) {
const str = this.callerId || req.callingNumber || '';
const callingNumber = str.startsWith('+') ? str.substring(1) : str;
const voip_carrier_sid = await lookupCarrierByPhoneNumber(cs.accountSid, callingNumber);
const req_voip_carrier_sid = req.has('X-Voip-Carrier-Sid') ? req.get('X-Voip-Carrier-Sid') : null;
if (voip_carrier_sid) {
this.logger.info(
`Dial:_attemptCalls: selected voip_carrier_sid ${voip_carrier_sid} for callingNumber: ${callingNumber}`);
opts.headers['X-Requested-Carrier-Sid'] = voip_carrier_sid;
// Check whether the outbound carrier differs from the inbound carrier and uses DTMF type 'tones'
if (voip_carrier_sid !== req_voip_carrier_sid) {
const [voipCarrier] = await lookupVoipCarrierBySid(voip_carrier_sid);
this.inbandDtmfEnabled = voipCarrier?.dtmf_type === 'tones';
}
}
}
@@ -579,7 +677,8 @@ class TaskDial extends Task {
rootSpan: cs.rootSpan,
startSpan: this.startSpan.bind(this),
dialTask: this,
onHoldMusic: this.cs.onHoldMusic
onHoldMusic: this.cs.onHoldMusic,
tmpFiles: this.cs.tmpFiles,
});
this.dials.set(sd.callSid, sd);
@@ -601,6 +700,7 @@ class TaskDial extends Task {
dialCallStatus: obj.callStatus,
dialSipStatus: obj.sipStatus,
dialCallSid: sd.callSid,
dialSbcCallid: sd.callInfo.sbcCallid
});
}
switch (obj.callStatus) {
@@ -679,6 +779,9 @@ class TaskDial extends Task {
this.epOther.api('uuid_break', this.epOther.uuid);
this.epOther.bridge(sd.ep);
}
else {
this.logger.error('Dial:_connectSingleDial - no other endpoint to bridge!');
}
this.bridged = true;
}
@@ -716,7 +819,7 @@ class TaskDial extends Task {
// Offhold, time to release media
const newSdp = await this.ep.modify(req.body);
await res.send(200, {body: newSdp});
await this._releaseMedia(this.cs, this.sd);
await this._releaseMedia(this.cs, this.sd, this.shouldExitMediaPathEntirely);
this.isOutgoingLegHold = false;
} else {
this.logger.debug('Dial: _onReinvite receive unhold Request, update media server');
@@ -800,10 +903,14 @@ class TaskDial extends Task {
if (this.parentDtmfCollector) this._installDtmfDetection(cs, cs.dlg);
if (this.childDtmfCollector) this._installDtmfDetection(cs, this.dlg);
if (cs.sipRequestWithinDialogHook) this._initSipIndialogRequestListener(cs, this.dlg);
if (cs.sipRequestWithinDialogHook) {
/* remove any existing listener to avoid duplicate events */
this._removeSipIndialogRequestListener(this.dlg);
this._initSipIndialogRequestListener(cs, this.dlg);
}
if (this.transcribeTask) this.transcribeTask.exec(cs, {ep: this.epOther, ep2:this.ep});
if (this.listenTask) this.listenTask.exec(cs, {ep: this.epOther});
if (this.listenTask) this.listenTask.exec(cs, {ep: this.listenTask.channel === 2 ? this.ep : this.epOther});
if (this.startAmd) {
try {
this.startAmd(cs, this.ep, this, this.data.amd);
@@ -823,9 +930,25 @@ class TaskDial extends Task {
this.logger.info({err}, 'Dial:_selectSingleDial - Error boosting audio signal');
}
}
/* basic determination to see if call is being transcoded */
const codecA = getLeadingCodec(this.epOther.local.sdp);
const codecB = getLeadingCodec(this.ep.remote.sdp);
this.weAreTranscoding = (codecA !== codecB);
if (this.weAreTranscoding) {
this.logger.info(`Dial:_selectSingleDial - transcoding from ${codecA} (A leg) to ${codecB} (B leg)`);
}
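// Worked example (assumption): if getLeadingCodec() returns 'PCMU' for the A-leg local SDP
// and 'opus' for the B-leg remote SDP, then codecA !== codecB and weAreTranscoding is true;
// matching leading codecs (e.g. both 'PCMU') leave weAreTranscoding false.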
/* if we can release the media back to the SBC, do so now */
if (this.canReleaseMedia) setTimeout(this._releaseMedia.bind(this, cs, sd), 200);
if (this.canReleaseMedia || this.shouldExitMediaPathEntirely) {
setTimeout(this._releaseMedia.bind(this, cs, sd, this.shouldExitMediaPathEntirely), 200);
}
this.sd.ep.once('destroy', this._handleMediaTimeout.bind(this));
}
_handleMediaTimeout(evt) {
if (evt?.reason === 'MEDIA_TIMEOUT' && this.sd && this.bridged) {
this.kill(this.cs, KillReason.MediaTimeout);
}
}
_bridgeEarlyMedia(sd) {
@@ -837,22 +960,57 @@ class TaskDial extends Task {
}
}
/* public api */
async updateMediaPath(desiredPath) {
this.logger.info(`Dial:updateMediaPath - ${this._mediaPath} => ${desiredPath}`);
switch (desiredPath) {
case MediaPath.NoMedia:
assert(this._mediaPath !== MediaPath.NoMedia, 'updateMediaPath: already no-media');
await this._releaseMedia(this.cs, this.sd, true);
break;
case MediaPath.PartialMedia:
assert(this._mediaPath !== MediaPath.PartialMedia, 'updateMediaPath: already partial-media');
if (this._mediaPath === MediaPath.FullMedia) {
await this._releaseMedia(this.cs, this.sd, false);
}
else {
// to go from no-media to partial-media we need to go through full-media first
await this.reAnchorMedia(this.cs, this.sd);
await this._releaseMedia(this.cs, this.sd, false);
}
assert(!this.epOther, 'updateMediaPath: epOther should be null');
assert(!this.ep, 'updateMediaPath: ep should be null');
break;
case MediaPath.FullMedia:
assert(this._mediaPath !== MediaPath.FullMedia, 'updateMediaPath: already full-media');
await this.reAnchorMedia(this.cs, this.sd);
break;
default:
assert(false, `updateMediaPath: invalid path request ${desiredPath}`);
}
}
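// Transition summary (a sketch, not part of the diff; assumes MediaPath comes from
// ../utils/constants and that callers await the returned promise):
//   full-media  -> no-media      : _releaseMedia(cs, sd, true)
//   full-media  -> partial-media : _releaseMedia(cs, sd, false)
//   no-media    -> partial-media : reAnchorMedia(cs, sd) then _releaseMedia(cs, sd, false)
//   any other   -> full-media    : reAnchorMedia(cs, sd)
// e.g. await dialTask.updateMediaPath(MediaPath.NoMedia);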
/**
* Release the media from freeswitch
* @param {*} cs
* @param {*} sd
*/
async _releaseMedia(cs, sd) {
async _releaseMedia(cs, sd, releaseEntirely = false) {
assert(cs.ep && sd.ep);
try {
// Wait until we have the new SDP from the B leg to offer to the A leg
const aLegSdp = cs.ep.remote.sdp;
await sd.releaseMediaToSBC(aLegSdp, cs.ep.local.sdp);
await sd.releaseMediaToSBC(aLegSdp, cs.ep.local.sdp, releaseEntirely);
const bLegSdp = sd.dlg.remote.sdp;
await cs.releaseMediaToSBC(bLegSdp);
await cs.releaseMediaToSBC(bLegSdp, releaseEntirely);
this.epOther = null;
this.logger.info('Dial:_releaseMedia - successfully released media from freewitch');
this._mediaPath = releaseEntirely ? MediaPath.NoMedia : MediaPath.PartialMedia;
this.logger.info(
`Dial:_releaseMedia - successfully released media from freeswitch, media path is now ${this._mediaPath}`);
} catch (err) {
this.logger.info({err}, 'Dial:_releaseMedia error');
}
@@ -861,9 +1019,15 @@ class TaskDial extends Task {
async reAnchorMedia(cs, sd) {
if (cs.ep && sd.ep) return;
this.logger.info('Dial:reAnchorMedia - re-anchoring media to freewitch');
await Promise.all([sd.reAnchorMedia(), cs.reAnchorMedia()]);
this.logger.info('Dial:reAnchorMedia - re-anchoring media to freeswitch');
await Promise.all([sd.reAnchorMedia(this._mediaPath), cs.reAnchorMedia(this._mediaPath)]);
this.epOther = cs.ep;
this.epOther.bridge(this.ep);
this._mediaPath = MediaPath.FullMedia;
this.logger.info(
`Dial:reAnchorMedia - successfully re-anchored media to freeswitch, media path is now ${this._mediaPath}`);
}
// Handle RE-INVITE hold from caller leg.
@@ -882,11 +1046,12 @@ class TaskDial extends Task {
}
this._onHoldHook(req);
} else if (!isOnhold(req.body)) {
if (this.epOther && this.ep && this.isIncomingLegHold && this.canReleaseMedia) {
if (this.epOther && this.ep && this.isIncomingLegHold &&
(this.canReleaseMedia || this.shouldExitMediaPathEntirely)) {
// Offhold, time to release media
const newSdp = await this.epOther.modify(req.body);
await res.send(200, {body: newSdp});
await this._releaseMedia(this.cs, this.sd);
await this._releaseMedia(this.cs, this.sd, this.shouldExitMediaPathEntirely);
isHandled = true;
}
this.isIncomingLegHold = false;
@@ -945,7 +1110,9 @@ class TaskDial extends Task {
callInfo: this.cs.callInfo,
accountInfo: this.cs.accountInfo,
tasks,
rootSpan: this.cs.rootSpan
rootSpan: this.cs.rootSpan,
req: this.cs.req,
tmpFiles: this.cs.tmpFiles,
});
await this._onHoldSession.exec();
this._onHoldSession = null;

View File

@@ -83,7 +83,8 @@ class TaskDub extends TtsTask {
action: 'playOnTrack',
track: this.track,
play: this.play,
loop: this.loop ? 'loop' : 'once',
// drachtio-fsmrf will convert loop from boolean to 'loop' or 'once'
loop: this.loop,
gain: this.gain
});
}

View File

@@ -369,7 +369,9 @@ class TaskEnqueue extends Task {
callInfo: cs.callInfo,
accountInfo: cs.accountInfo,
tasks: tasksToRun,
rootSpan: cs.rootSpan
rootSpan: cs.rootSpan,
req: cs.req,
tmpFiles: cs.tmpFiles,
});
await this._playSession.exec();
this._playSession = null;

View File

@@ -5,14 +5,21 @@ const {
AwsTranscriptionEvents,
AzureTranscriptionEvents,
DeepgramTranscriptionEvents,
GladiaTranscriptionEvents,
SonioxTranscriptionEvents,
CobaltTranscriptionEvents,
IbmTranscriptionEvents,
NvidiaTranscriptionEvents,
JambonzTranscriptionEvents,
AssemblyAiTranscriptionEvents,
HoundifyTranscriptionEvents,
DeepgramfluxTranscriptionEvents,
VoxistTranscriptionEvents,
CartesiaTranscriptionEvents,
OpenAITranscriptionEvents,
VadDetection,
VerbioTranscriptionEvents
VerbioTranscriptionEvents,
SpeechmaticsTranscriptionEvents
} = require('../utils/constants.json');
const {
JAMBONES_GATHER_EARLY_HINTS_MATCH,
@@ -22,6 +29,8 @@ const {
const makeTask = require('./make_task');
const assert = require('assert');
const SttTask = require('./stt-task');
const { SpeechCredentialError } = require('../utils/error');
const SPEECHMATICS_DEFAULT_ASR_TIMEOUT = 1200;
class TaskGather extends SttTask {
constructor(logger, opts, parentTask) {
@@ -79,12 +88,15 @@ class TaskGather extends SttTask {
this._bufferedTranscripts = [];
this.partialTranscriptsCount = 0;
this.bugname_prefix = 'gather_';
}
get name() { return TaskName.Gather; }
get needsStt() { return this.input.includes('speech'); }
get isBackgroundGather() { return this.bugname_prefix === 'background_bargeIn_'; }
get wantsSingleUtterance() {
return this.data.recognizer?.singleUtterance === true;
}
@@ -106,6 +118,12 @@ class TaskGather extends SttTask {
return this.fillerNoise.startDelaySecs;
}
get isStreamingTts() { return this.sayTask && this.sayTask.isStreamingTts; }
getTtsVendorData() {
if (this.sayTask) return this.sayTask.getTtsVendorData(this.cs);
}
get summary() {
let s = `${this.name}{`;
if (this.input.length === 2) s += 'inputs=[speech,digits],';
@@ -122,10 +140,26 @@ class TaskGather extends SttTask {
return s;
}
async exec(cs, {ep}) {
async exec(cs, obj) {
try {
await this.handling(cs, obj);
} catch (error) {
if (
// a sticky barge-in task would otherwise restart forever;
// rethrow the exception to stop the loop.
!this.sticky &&
error instanceof SpeechCredentialError) {
this.logger.info('Gather failed due to SpeechCredentialError, finished!');
this.notifyTaskDone();
return;
}
throw error;
}
}
async handling(cs, {ep}) {
this.logger.debug({options: this.data}, 'Gather:exec');
await super.exec(cs, {ep});
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, cs.srf);
this.fillerNoise = {
...(cs.fillerNoise || {}),
@@ -141,12 +175,23 @@ class TaskGather extends SttTask {
const {hints, hintsBoost} = cs.globalSttHints;
const setOfHints = new Set((this.data.recognizer.hints || [])
.concat(hints)
.filter((h) => typeof h === 'string' && h.length > 0));
// allow hints to be an array of objects
.filter((h) => (typeof h === 'string' && h.length > 0) || (typeof h === 'object')));
this.data.recognizer.hints = [...setOfHints];
if (!this.data.recognizer.hintsBoost && hintsBoost) this.data.recognizer.hintsBoost = hintsBoost;
this.logger.debug({hints: this.data.recognizer.hints, hintsBoost: this.data.recognizer.hintsBoost},
'Gather:exec - applying global sttHints');
}
// special case for speechmatics: they don't do endpointing, so we need to enable continuous ASR
if (this.vendor === 'speechmatics' && !this.isContinuousAsr) {
const maxDelay = this.recognizer?.speechmaticsOptions?.transcription_config?.max_delay;
if (maxDelay) this.asrTimeout = Math.min(SPEECHMATICS_DEFAULT_ASR_TIMEOUT, maxDelay * 1000);
else this.asrTimeout = SPEECHMATICS_DEFAULT_ASR_TIMEOUT;
this.isContinuousAsr = true;
this.logger.debug(`Gather:exec - auto-enabling continuous ASR for speechmatics w/ timeout ${this.asrTimeout}`);
}
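// Worked example (assumption about units): with
// recognizer.speechmaticsOptions.transcription_config.max_delay = 0.7 (seconds),
// asrTimeout becomes Math.min(1200, 0.7 * 1000) = 700ms; with max_delay = 2 it is capped at
// the 1200ms default, and with no max_delay it stays at SPEECHMATICS_DEFAULT_ASR_TIMEOUT.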
if (!this.isContinuousAsr && cs.isContinuousAsr) {
this.isContinuousAsr = true;
this.asrTimeout = cs.asrTimeout * 1000;
@@ -177,14 +222,18 @@ class TaskGather extends SttTask {
this._startVad();
const startDtmfListener = () => {
assert(!this._dtmfListenerStarted);
if (this.input.includes('digits') || this.dtmfBargein || this.asrDtmfTerminationDigit) {
ep.on('dtmf', this._onDtmf.bind(this, cs, ep));
this._dtmfListenerStarted = true;
}
};
const startListening = async(cs, ep) => {
this._startTimer();
if (this.isContinuousAsr && 0 === this.timeout) this._startAsrTimer();
if (this.isContinuousAsr && 0 === this.timeout && !this.isBackgroundGather) {
this._startAsrTimer();
}
if (this.input.includes('speech') && !this.listenDuringPrompt) {
try {
await this._setSpeechHandlers(cs, ep);
@@ -193,7 +242,6 @@ class TaskGather extends SttTask {
return;
}
this._startTranscribing(ep);
return updateSpeechCredentialLastUsed(this.sttCredentials.speech_credential_sid);
} catch (e) {
await this._startFallback(cs, ep, {error: e});
}
@@ -205,6 +253,7 @@ class TaskGather extends SttTask {
const {span, ctx} = this.startChildSpan(`nested:${this.sayTask.summary}`);
const process = () => {
this.logger.debug('Gather: nested say task completed');
this.playComplete = true;
if (!this.listenDuringPrompt) {
startDtmfListener();
}
@@ -221,20 +270,28 @@ class TaskGather extends SttTask {
};
this.sayTask.span = span;
this.sayTask.ctx = ctx;
this.sayTask.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
this.sayTask
.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
.then(() => {
if (this.sayTask.isStreamingTts) return;
this.logger.debug('Gather:exec - nested say task completed');
span.end();
process();
return;
})
.catch((err) => {
process();
});
this.sayTask.on('playDone', (err) => {
span.end();
if (err) this.logger.error({err}, 'Gather:exec Error playing tts');
if (this.sayTask.isStreamingTts && !this.sayTask.closeOnStreamEmpty) {
// for streaming tts that does not close the stream automatically, don't wait for it to complete
process();
});
}
}
else if (this.playTask) {
const {span, ctx} = this.startChildSpan(`nested:${this.playTask.summary}`);
const process = () => {
this.logger.debug('Gather: nested play task completed');
this.playComplete = true;
if (!this.listenDuringPrompt) {
startDtmfListener();
}
@@ -251,15 +308,17 @@ class TaskGather extends SttTask {
};
this.playTask.span = span;
this.playTask.ctx = ctx;
this.playTask.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
this.playTask
.exec(cs, {ep}) // kicked off, _not_ waiting for it to complete
.then(() => {
this.logger.debug('Gather:exec - nested play task completed');
span.end();
process();
return;
})
.catch((err) => {
process();
});
this.playTask.on('playDone', (err) => {
span.end();
if (err) this.logger.error({err}, 'Gather:exec Error playing url');
process();
});
}
else {
if (this.killed) {
@@ -273,15 +332,14 @@ class TaskGather extends SttTask {
await this._setSpeechHandlers(cs, ep);
if (!this.resolved && !this.killed) {
this._startTranscribing(ep);
updateSpeechCredentialLastUsed(this.sttCredentials.speech_credential_sid)
.catch(() => {/*already logged error */});
}
else {
this.logger.info('Gather:exec - task was killed or resolved quickly, not starting transcription');
}
}
if (this.listenDuringPrompt) {
// https://github.com/jambonz/jambonz-feature-server/issues/913
if (this.listenDuringPrompt || (!this.sayTask && !this.playTask)) {
startDtmfListener();
}
@@ -304,6 +362,7 @@ class TaskGather extends SttTask {
this.sayTask?.span.end();
this._stopVad();
this._resolve('killed');
cs.stopSttLatencyVad();
}
updateTaskInProgress(opts) {
@@ -319,12 +378,25 @@ class TaskGather extends SttTask {
_onDtmf(cs, ep, evt) {
this.logger.debug(evt, 'TaskGather:_onDtmf');
if (!this._timeoutTimer && this.timeout > 0) {
this._startTimer();
}
clearTimeout(this.interDigitTimer);
let resolved = false;
if (this.dtmfBargein) {
if (!this.playComplete) {
this.notifyStatus({event: 'dtmf-bargein-detected', ...evt});
}
this._killAudio(cs);
this.emit('dtmf', evt);
}
if (this.isContinuousAsr && evt.dtmf === this.asrDtmfTerminationDigit && this._bufferedTranscripts.length > 0) {
this.logger.info(`continuousAsr triggered with dtmf ${this.asrDtmfTerminationDigit}`);
this._clearAsrTimer();
this._clearTimer();
this._startFinalAsrTimer();
return;
}
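// Example flow (assumption): with continuous ASR enabled, asrDtmfTerminationDigit = '#', and
// buffered transcripts present, pressing '#' clears the asr and main timers and starts the
// final asr timer (~1s), so the buffered speech is returned promptly instead of waiting out
// the full asrTimeout.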
if (evt.dtmf === this.finishOnKey && this.input.includes('digits')) {
resolved = true;
this._resolve('dtmf-terminator-key');
@@ -333,12 +405,8 @@ class TaskGather extends SttTask {
if (this.digitBuffer.length === 0 && this.needsStt) {
// DTMF is higher priority than STT.
this.removeCustomEventListeners();
ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname,
})
.catch((err) => this.logger.error({err},
` Received DTMF, Error stopping transcription for vendor ${this.vendor}`));
this._clearAsrTimer(); //clear ASR timer as we're now using dtmf
this._stopTranscribing(ep);
}
this.digitBuffer += evt.dtmf;
const len = this.digitBuffer.length;
@@ -347,18 +415,12 @@ class TaskGather extends SttTask {
this._resolve('dtmf-num-digits');
}
}
else if (this.isContinuousAsr && evt.dtmf === this.asrDtmfTerminationDigit) {
this.logger.info(`continuousAsr triggered with dtmf ${this.asrDtmfTerminationDigit}`);
this._clearAsrTimer();
this._clearTimer();
this._startFinalAsrTimer();
return;
}
if (!resolved && this.interDigitTimeout > 0 && this.digitBuffer.length >= this.minDigits) {
/* start interDigitTimer */
const ms = this.interDigitTimeout * 1000;
this.logger.debug(`starting interdigit timer of ${ms}`);
this.interDigitTimer = setTimeout(() => this._resolve('dtmf-interdigit-timeout'), ms);
this._clearTimer(); //clear main timer as we're now using interdigit dtmf timer
}
}
@@ -416,6 +478,28 @@ class TaskGather extends SttTask {
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'deepgramflux':
this.bugname = `${this.bugname_prefix}deepgramflux_transcribe`;
this.addCustomEventListener(
ep, DeepgramfluxTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(
ep, DeepgramfluxTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'gladia':
this.bugname = `${this.bugname_prefix}gladia_transcribe`;
this.addCustomEventListener(
ep, GladiaTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'soniox':
@@ -494,6 +578,83 @@ class TaskGather extends SttTask {
this.addCustomEventListener(ep, AssemblyAiTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
break;
case 'houndify':
this.bugname = `${this.bugname_prefix}houndify_transcribe`;
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Error,
this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
break;
case 'voxist':
this.bugname = `${this.bugname_prefix}voxist_transcribe`;
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(
ep, VoxistTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, VoxistTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
break;
case 'cartesia':
this.bugname = `${this.bugname_prefix}cartesia_transcribe`;
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(
ep, CartesiaTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
break;
case 'speechmatics':
this.bugname = `${this.bugname_prefix}speechmatics_transcribe`;
this.addCustomEventListener(
ep, SpeechmaticsTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Info,
this._onSpeechmaticsInfo.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.RecognitionStarted,
this._onSpeechmaticsRecognitionStarted.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Error,
this._onSpeechmaticsErrror.bind(this, cs, ep));
break;
case 'openai':
this.bugname = `${this.bugname_prefix}openai_transcribe`;
this.addCustomEventListener(
ep, OpenAITranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
this.addCustomEventListener(
ep, OpenAITranscriptionEvents.SpeechStarted, this._onOpenAISpeechStarted.bind(this, cs, ep));
this.addCustomEventListener(
ep, OpenAITranscriptionEvents.SpeechStopped, this._onOpenAISpeechStopped.bind(this, cs, ep));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.Error,
this._onOpenAIErrror.bind(this, cs, ep));
/* openai delta transcripts are useful only for minBargeinWordCount eval */
if (this.minBargeinWordCount > 1) {
this.openaiPartials = [];
opts.OPENAI_WANT_PARTIALS = 1;
this.addCustomEventListener(
ep, OpenAITranscriptionEvents.PartialTranscript, this._onOpenAIPartialTranscript.bind(this, cs, ep));
}
this.modelSupportsConversationTracking = opts.OPENAI_MODEL !== 'whisper-1';
break;
default:
if (this.vendor.startsWith('custom:')) {
this.bugname = `${this.bugname_prefix}${this.vendor}_transcribe`;
@@ -518,13 +679,32 @@ class TaskGather extends SttTask {
}
_startTranscribing(ep) {
this.logger.debug({
this.logger.info({
vendor: this.vendor,
locale: this.language,
interim: this.interim,
bugname: this.bugname
}, 'Gather:_startTranscribing');
/* special feature for openai: we can provide a prompt that includes recent conversation history */
let prompt;
if (this.vendor === 'openai') {
if (this.modelSupportsConversationTracking) {
prompt = this.formatOpenAIPrompt(this.cs, {
prompt: this.data.recognizer?.openaiOptions?.prompt,
hintsTemplate: this.data.recognizer?.openaiOptions?.promptTemplates?.hintsTemplate,
// eslint-disable-next-line max-len
conversationHistoryTemplate: this.data.recognizer?.openaiOptions?.promptTemplates?.conversationHistoryTemplate,
hints: this.data.recognizer?.hints,
});
this.logger.debug({prompt}, 'Gather:_startTranscribing - created an openai prompt');
}
else if (this.data.recognizer?.hints?.length > 0) {
prompt = this.data.recognizer?.hints.join(', ');
}
}
/**
* Note: we don't need to ask deepgram for interim results, because they
* already send us words as they are finalized (is_final=true) even before
@@ -536,6 +716,7 @@ class TaskGather extends SttTask {
interim: this.interim,
bugname: this.bugname,
hostport: this.hostport,
prompt
}).catch((err) => {
const {writeAlerts, AlertType} = this.cs.srf.locals;
this.logger.error(err, 'TaskGather:_startTranscribing error');
@@ -543,17 +724,22 @@ class TaskGather extends SttTask {
account_sid: this.cs.accountSid,
alert_type: AlertType.STT_FAILURE,
vendor: this.vendor,
detail: err.message
detail: err.message,
target_sid: this.cs.callSid
});
}).catch((err) => this.logger.info({err}, 'Error generating alert for tts failure'));
// Some vendors use a single connection, so we cannot use the onConnect event to track transcription start
this.cs.emit('transcribe-start');
}
_startTimer() {
if (0 === this.timeout) return;
this.logger.debug(`Starting timeoutTimer of ${this.timeout}ms`);
this._clearTimer();
this._timeoutTimer = setTimeout(() => {
// If continuousASR in use then extend by the asr window for more transcripts.
if (this.isContinuousAsr) this._startAsrTimer();
if (this.interDigitTimer) return; // let the inter-digit timer complete
else {
this._resolve(this.digitBuffer.length >= this.minDigits ? 'dtmf-num-digits' : 'timeout');
}
@@ -570,15 +756,24 @@ class TaskGather extends SttTask {
}
_startAsrTimer() {
if (this.vendor === 'deepgram') return; // no need
// Deepgram can send an UtteranceEnd that does not cover the last word's end time,
// so we still wait for the asrTimeout to be sure the last word has been received.
// if (this.vendor === 'deepgram') return; // no need
assert(this.isContinuousAsr);
this._clearAsrTimer();
this._asrTimer = setTimeout(() => {
this.logger.debug('_startAsrTimer - asr timer went off');
this.logger.info('_startAsrTimer - asr timer went off');
const evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language, this.vendor);
/* special case for speechmatics - keep listening if we don't have any transcripts */
if (this.vendor === 'speechmatics' && this._bufferedTranscripts.length === 0) {
this.logger.debug('Gather:_startAsrTimer - speechmatics, no transcripts yet, keep listening');
this._startAsrTimer();
return;
}
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout', evt);
}, this.asrTimeout);
this.logger.debug(`_startAsrTimer: set for ${this.asrTimeout}ms`);
this.logger.info(`_startAsrTimer: set for ${this.asrTimeout}ms`);
}
_clearAsrTimer() {
@@ -590,7 +785,7 @@ class TaskGather extends SttTask {
}
_hangupCall() {
this.logger.debug('_hangupCall');
this.logger.debug('TaskGather:_hangupCall');
this.cs.hangup();
}
@@ -613,7 +808,7 @@ class TaskGather extends SttTask {
_startFinalAsrTimer() {
this._clearFinalAsrTimer();
this._finalAsrTimer = setTimeout(() => {
this.logger.debug('_startFinalAsrTimer - final asr timer went off');
this.logger.info('_startFinalAsrTimer - final asr timer went off');
const evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language, this.vendor);
this._resolve(this._bufferedTranscripts.length > 0 ? 'speech' : 'timeout', evt);
}, 1000);
@@ -686,40 +881,59 @@ class TaskGather extends SttTask {
this._fillerNoiseOn = false; // in a race, if we just started audio it may sneak through here
this.ep.api('uuid_break', this.ep.uuid)
.catch((err) => this.logger.info(err, 'Error killing audio'));
cs.clearTtsStream();
}
return;
}
if (this.sayTask && !this.sayTask.killed) {
this.sayTask.removeAllListeners('playDone');
this.sayTask.kill(cs);
this.sayTask = null;
}
if (this.playTask && !this.playTask.killed) {
this.playTask.removeAllListeners('playDone');
this.playTask.kill(cs);
this.playTask = null;
}
this.playComplete = true;
}
_onTranscription(cs, ep, evt, fsEvent) {
// check if we are in graceful shutdown mode
if (ep.gracefulShutdownResolver) {
ep.gracefulShutdownResolver();
}
// make sure this is not a transcript from answering machine detection
const bugname = fsEvent.getHeader('media-bugname');
const finished = fsEvent.getHeader('transcription-session-finished');
this.logger.debug({evt, bugname, finished, vendor: this.vendor}, 'Gather:_onTranscription raw transcript');
if (bugname && this.bugname !== bugname) return;
if (bugname && this.bugname !== bugname) {
this.logger.info(
`Gather:_onTranscription - ignoring transcript from ${bugname} because our bug is ${this.bugname}`);
return;
}
if (finished === 'true') return;
if (this.vendor === 'ibm' && evt?.state === 'listening') return;
// emit an event to the call session to track the time transcription is received
cs.emit('on-transcription');
if (this.vendor === 'deepgram' && evt.type === 'UtteranceEnd') {
/* we will only get this when we have set utterance_end_ms */
if (this._bufferedTranscripts.length === 0) {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram but no buffered transcripts');
this.logger.info('Gather:_onTranscription - got UtteranceEnd event from deepgram but no buffered transcripts');
}
else {
this.logger.debug('Gather:_onTranscription - got UtteranceEnd event from deepgram, return buffered transcript');
evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language, this.vendor);
this._bufferedTranscripts = [];
this._resolve('speech', evt);
const utteranceTime = evt.last_word_end;
// eslint-disable-next-line max-len
if (utteranceTime && this._dgTimeOfLastUnprocessedWord && utteranceTime < this._dgTimeOfLastUnprocessedWord && utteranceTime != -1) {
this.logger.info('Gather:_onTranscription - got UtteranceEnd with unprocessed words, continue listening');
}
else {
this.logger.info('Gather:_onTranscription - got UtteranceEnd from deepgram, return buffered transcript');
evt = this.consolidateTranscripts(this._bufferedTranscripts, 1, this.language, this.vendor);
this._bufferedTranscripts = [];
this._resolve('speech', evt);
}
}
return;
}
@@ -730,7 +944,7 @@ class TaskGather extends SttTask {
evt = this.normalizeTranscription(evt, this.vendor, 1, this.language,
this.shortUtterance, this.data.recognizer.punctuation);
//this.logger.debug({evt, bugname, finished, vendor: this.vendor}, 'Gather:_onTranscription normalized transcript');
this.logger.debug({evt, bugname, finished, vendor: this.vendor}, 'Gather:_onTranscription normalized transcript');
if (evt.alternatives.length === 0) {
this.logger.info({evt}, 'TaskGather:_onTranscription - got empty transcript, continue listening');
@@ -738,8 +952,6 @@ class TaskGather extends SttTask {
}
const confidence = evt.alternatives[0].confidence;
const minConfidence = this.data.recognizer?.minConfidence;
this.logger.debug({evt},
`TaskGather:_onTranscription - confidence (${confidence}), minConfidence (${minConfidence})`);
if (confidence && minConfidence && confidence < minConfidence) {
this.logger.info({evt},
'TaskGather:_onTranscription - Transcript confidence ' +
@@ -788,12 +1000,18 @@ class TaskGather extends SttTask {
}
}
// receive a final transcript, calculate the stt latency for this transcript
const sttLatency = this.cs.calculateSttLatency();
if (!emptyTranscript && sttLatency) {
this.stt_latency_ms += `${sttLatency.stt_latency_ms},`;
}
if (this.isContinuousAsr) {
/* append the transcript and start listening again for asrTimeout */
const t = evt.alternatives[0].transcript;
if (t) {
/* remove trailing punctuation */
if (/[,;:\.!\?]$/.test(t)) {
if (this.vendor !== 'speechmatics' && /[,;:\.!\?]$/.test(t)) {
this.logger.debug('TaskGather:_onTranscription - removing trailing punctuation');
evt.alternatives[0].transcript = t.slice(0, -1);
}
@@ -809,7 +1027,10 @@ class TaskGather extends SttTask {
this._startAsrTimer();
/* some STT engines will keep listening after a final response, so no need to restart */
if (!['soniox', 'aws', 'microsoft', 'deepgram'].includes(this.vendor)) this._startTranscribing(ep);
if (!['soniox', 'aws', 'microsoft', 'deepgram', 'speechmatics'].includes(this.vendor) &&
!this.vendor.startsWith('custom')) {
this._startTranscribing(ep);
}
}
else {
/* this was removed to fix https://github.com/jambonz/jambonz-feature-server/issues/783 */
@@ -852,8 +1073,21 @@ class TaskGather extends SttTask {
if (originalEvent.is_final && evt.alternatives[0].transcript !== '') {
this.logger.debug({evt}, 'Gather:_onTranscription - buffering a completed (partial) deepgram transcript');
this._bufferedTranscripts.push(evt);
this._dgTimeOfLastUnprocessedWord = null;
}
if (evt.alternatives[0].transcript === '') {
emptyTranscript = true;
}
else if (!originalEvent.is_final) {
/* Deepgram: we have unprocessed words - save the last word's end time so we can later compare it to UtteranceEnd */
const words = originalEvent.channel.alternatives[0].words;
if (words?.length > 0) {
this._dgTimeOfLastUnprocessedWord = words.slice(-1)[0].end;
this.logger.debug(
`TaskGather:_onTranscription - saving word end time: ${this._dgTimeOfLastUnprocessedWord}`);
}
}
if (evt.alternatives[0].transcript === '') emptyTranscript = true;
}
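// Worked example (assumption): an interim (is_final=false) result ends with a word at
// end=12.4s, so _dgTimeOfLastUnprocessedWord = 12.4. If a later UtteranceEnd reports
// last_word_end = 11.9 (< 12.4 and != -1) we keep listening because unprocessed words
// remain; if last_word_end >= 12.4 we consolidate and return the buffered transcript.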
if (!emptyTranscript) {
@@ -862,6 +1096,7 @@ class TaskGather extends SttTask {
if (!this.playComplete) {
this.logger.debug({transcript: evt.alternatives[0].transcript}, 'killing audio due to speech');
this.emit('vad');
this.notifyStatus({event: 'speech-bargein-detected', ...evt});
}
this._killAudio(cs);
}
@@ -875,16 +1110,24 @@ class TaskGather extends SttTask {
this.cs.requestor.request('verb:hook', this.partialResultHook, Object.assign({speech: evt},
this.cs.callInfo, httpHeaders));
}
else if (this.vendor === 'deepgramflux' &&
['EagerEndOfTurn', 'TurnResumed'].includes(evt.vendor.evt?.event)) {
this.logger.debug(`Gather:_onTranscription - deepgramflux event detected: ${evt.event}`);
this.performAction({speech: evt, reason: 'speechDetected'}, false);
}
if (this.vendor === 'soniox') {
this._clearTimer();
if (evt.vendor.finalWords.length) {
this.logger.debug({evt}, 'TaskGather:_onTranscription - buffering soniox transcript');
this._sonioxTranscripts.push(evt.vendor.finalWords);
}
}
/* restart asr timer if we get a partial transcript */
if (this.isContinuousAsr) this._startAsrTimer();
// If transcription received, reset timeout timer.
if (this._timeoutTimer && !emptyTranscript) {
this._startTimer();
}
/* restart asr timer if we get a partial transcript (only if the asr timer is already running) */
/* note: https://github.com/jambonz/jambonz-feature-server/issues/866 */
if (this.isContinuousAsr && this._asrTimer) this._startAsrTimer();
}
}
_onEndOfUtterance(cs, ep) {
@@ -919,12 +1162,7 @@ class TaskGather extends SttTask {
async _startFallback(cs, ep, evt) {
if (this.canFallback) {
ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname
})
.catch((err) => this.logger.error({err}, `Error stopping transcription for primary vendor ${this.vendor}`));
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, cs.srf);
this._stopTranscribing(ep);
try {
this.logger.debug('gather:_startFallback');
this.notifyError({ msg: 'ASR error',
@@ -933,7 +1171,6 @@ class TaskGather extends SttTask {
this._speechHandlersSet = false;
await this._setSpeechHandlers(cs, ep);
this._startTranscribing(ep);
updateSpeechCredentialLastUsed(this.sttCredentials.speech_credential_sid);
return true;
} catch (error) {
this.logger.info({error}, `There is error while falling back to ${this.fallbackVendor}`);
@@ -961,11 +1198,13 @@ class TaskGather extends SttTask {
if (code === 413 && error === 'Too much speech') return this._resolve('timeout');
}
this.logger.info({evt}, 'TaskGather:_onJambonzError');
const errMessage = evt.error || evt.Message;
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
message: `Custom speech vendor ${this.vendor} error: ${evt.error}`,
message: `Custom speech vendor ${this.vendor} error: ${errMessage}`,
vendor: this.vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, 'Error generating alert for jambonz custom connection failure'));
if (!(await this._startFallback(cs, ep, evt))) {
this.notifyTaskDone();
@@ -979,12 +1218,52 @@ class TaskGather extends SttTask {
}
}
async _onSpeechmaticsErrror(cs, _ep, evt) {
// eslint-disable-next-line no-unused-vars
const {message, ...e} = evt;
this._onVendorError(cs, _ep, {error: JSON.stringify(e)});
}
async _onOpenAIErrror(cs, _ep, evt) {
// eslint-disable-next-line no-unused-vars
const {message, ...e} = evt;
this._onVendorError(cs, _ep, {error: JSON.stringify(e)});
}
async _onOpenAISpeechStarted(cs, _ep, evt) {
this.logger.debug({evt}, 'TaskGather:_onOpenAISpeechStarted');
}
async _onOpenAISpeechStopped(cs, _ep, evt) {
this.logger.debug({evt}, 'TaskGather:_onOpenAISpeechStopped');
}
async _onOpenAIPartialTranscript(cs, _ep, evt) {
if (!this.playComplete) {
const words = evt.delta.split(' ').filter((w) => /[A-Za-z0-9]/.test(w));
this.openaiPartials.push(...words);
this.logger.debug({words, partials: this.openaiPartials, evt}, 'TaskGather:_onOpenAIPartialTranscript - words');
if (this.openaiPartials.length >= this.minBargeinWordCount) {
this.logger.debug({partials: this.openaiPartials}, 'killing audio due to speech (openai)');
this._killAudio(cs);
this.notifyStatus({event: 'speech-bargein-detected', words: this.openaiPartials});
}
}
}
async _onVendorError(cs, _ep, evt) {
super._onVendorError(cs, _ep, evt);
if (!(await this._startFallback(cs, _ep, evt))) {
this._resolve('stt-error', evt);
}
}
async _onSpeechmaticsRecognitionStarted(_cs, _ep, evt) {
this.logger.debug({evt}, 'TaskGather:_onSpeechmaticsRecognitionStarted');
}
async _onSpeechmaticsInfo(_cs, _ep, evt) {
this.logger.debug({evt}, 'TaskGather:_onSpeechmaticsInfo');
}
_onVadDetected(cs, ep) {
if (this.bargein && this.minBargeinWordCount === 0) {
@@ -1012,17 +1291,26 @@ class TaskGather extends SttTask {
}
}
async _stopTranscribing(ep) {
// Fix for https://github.com/jambonz/jambonz-feature-server/issues/1281
// We should immediately call stop transcription from gather
// so that next gather can start transcription immediately
ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname,
gracefulShutdown: false
})
.catch((err) => {
if (this.resolved) return;
this.logger.error({err}, 'Error stopping transcription');
});
this.cs.emit('transcribe-stop');
}
async _resolve(reason, evt) {
this.logger.debug(`TaskGather:resolve with reason ${reason}`);
this.logger.info({evt}, `TaskGather:resolve with reason ${reason}`);
if (this.needsStt && this.ep && this.ep.connected) {
this.ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname
})
.catch((err) => {
if (this.resolved) return;
this.logger.error({err}, 'Error stopping transcription');
});
this._stopTranscribing(this.ep);
}
if (this.resolved) {
this.logger.debug('TaskGather:_resolve - already resolved');
@@ -1041,14 +1329,32 @@ class TaskGather extends SttTask {
this._clearAsrTimer();
this._clearFinalAsrTimer();
let sttLatencyMetrics = {};
if (this.needsStt) {
const sttLatency = this.cs.calculateSttLatency();
if (sttLatency) {
this.stt_latency_ms = this.stt_latency_ms.endsWith(',') ?
this.stt_latency_ms.slice(0, -1) : this.stt_latency_ms;
sttLatencyMetrics = {
'stt.latency_ms': this.stt_latency_ms,
'stt.talkspurts': JSON.stringify(sttLatency.talkspurts),
'stt.start_time': sttLatency.stt_start_time,
'stt.stop_time': sttLatency.stt_stop_time,
'stt.usage': sttLatency.stt_usage,
};
}
}
this.span.setAttributes({
channel: 1,
'stt.label': this.label || 'None',
'stt.resolve': reason,
'stt.result': JSON.stringify(evt)
'stt.result': JSON.stringify(evt),
...sttLatencyMetrics
});
if (this.callSession && this.callSession.callGone) {
this.logger.debug('TaskGather:_resolve - call is gone, not invoking web callback');
this.logger.info('TaskGather:_resolve - call is gone, not invoking web callback');
this.notifyTaskDone();
return;
}
@@ -1072,6 +1378,9 @@ class TaskGather extends SttTask {
let returnedVerbs = false;
try {
const latencies = Object.fromEntries(
Object.entries(sttLatencyMetrics).map(([key, value]) => [key.replace('stt.', 'stt_'), value])
);
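// Example (assumption): { 'stt.latency_ms': '420,380', 'stt.usage': 3.2 } becomes
// { stt_latency_ms: '420,380', stt_usage: 3.2 }, so the keys can be spread safely into
// the action-hook payloads below.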
if (reason.startsWith('dtmf')) {
if (this.parentTask) this.parentTask.emit('dtmf', evt);
else {
@@ -1080,11 +1389,12 @@ class TaskGather extends SttTask {
}
}
else if (reason.startsWith('speech')) {
this.cs.emit('userSaid', evt.alternatives[0].transcript);
if (this.parentTask) this.parentTask.emit('transcription', evt);
else {
this.emit('transcription', evt);
this.logger.debug('TaskGather:_resolve - invoking performAction');
returnedVerbs = await this.performAction({speech: evt, reason: 'speechDetected'});
returnedVerbs = await this.performAction({speech: evt, reason: 'speechDetected', ...latencies});
this.logger.debug({returnedVerbs}, 'TaskGather:_resolve - back from performAction');
}
}
@@ -1092,26 +1402,25 @@ class TaskGather extends SttTask {
if (this.parentTask) this.parentTask.emit('timeout', evt);
else {
this.emit('timeout', evt);
returnedVerbs = await this.performAction({reason: 'timeout'});
returnedVerbs = await this.performAction({reason: 'timeout', ...latencies});
}
}
else if (reason.startsWith('stt-error')) {
if (this.parentTask) this.parentTask.emit('stt-error', evt);
else {
this.emit('stt-error', evt);
returnedVerbs = await this.performAction({reason: 'error', details: evt.error});
returnedVerbs = await this.performAction({reason: 'error', details: evt.error, ...latencies});
}
} else if (reason.startsWith('stt-low-confidence')) {
if (this.parentTask) this.parentTask.emit('stt-low-confidence', evt);
else {
this.emit('stt-low-confidence', evt);
returnedVerbs = await this.performAction({reason: 'stt-low-confidence'});
returnedVerbs = await this.performAction({speech:evt, reason: 'stt-low-confidence', ...latencies});
}
}
} catch (err) { /*already logged error*/ }
// Gather got response from hook, cancel actionHookDelay processing
this.logger.debug('TaskGather:_resolve - checking ahd');
if (this.cs.actionHookDelayProcessor) {
if (returnedVerbs) {
this.logger.debug('TaskGather:_resolve - got response from action hook, cancelling actionHookDelay');

View File

@@ -1,10 +1,21 @@
const Task = require('./task');
const {TaskName, TaskPreconditions, ListenEvents, ListenStatus} = require('../utils/constants');
const {TaskName, TaskPreconditions, ListenEvents, ListenStatus} = require('../utils/constants.json');
const makeTask = require('./make_task');
const moment = require('moment');
const MAX_PLAY_AUDIO_QUEUE_SIZE = 10;
const DTMF_SPAN_NAME = 'dtmf';
function escapeString(str) {
return str
.replace(/\\/g, '\\\\') // Escape backslashes
.replace(/"/g, '\\"') // Escape double quotes
.replace(/[\b]/g, '\\b') // Escape backspace (NOTE: [\b] not \b)
.replace(/\f/g, '\\f') // Escape formfeed
.replace(/\n/g, '\\n') // Escape newlines
.replace(/\r/g, '\\r') // Escape carriage returns
.replace(/\t/g, '\\t'); // Escape tabs
}
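// Illustrative usage (not part of the diff): escaping a metadata value before it is
// embedded inside a JSON string passed to the media server:
//   escapeString('line1\nsays "hi"\t\\end')
//   // => 'line1\\nsays \\"hi\\"\\t\\\\end'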
class TaskListen extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts);
@@ -16,10 +27,21 @@ class TaskListen extends Task {
this.preconditions = TaskPreconditions.Endpoint;
[
'action', 'auth', 'method', 'url', 'finishOnKey', 'maxLength', 'metadata', 'mixType', 'passDtmf', 'playBeep',
'sampleRate', 'timeout', 'transcribe', 'wsAuth', 'disableBidirectionalAudio'
'action', 'auth', 'method', 'url', 'finishOnKey', 'maxLength', 'mixType', 'passDtmf', 'playBeep',
'sampleRate', 'timeout', 'transcribe', 'wsAuth', 'disableBidirectionalAudio', 'channel'
].forEach((k) => this[k] = this.data[k]);
//Escape JSON special characters in metadata
if (this.data.metadata) {
this.metadata = {};
for (const key in this.data.metadata) {
if (this.data.metadata.hasOwnProperty(key)) {
const value = this.data.metadata[key];
this.metadata[key] = typeof value === 'string' ? escapeString(value) : value;
}
}
}
this.mixType = this.mixType || 'mono';
this.sampleRate = this.sampleRate || 8000;
this.earlyMedia = this.data.earlyMedia === true;
@@ -72,7 +94,7 @@ class TaskListen extends Task {
} catch (err) {
this.logger.info(err, `TaskListen:exec - error ${this.url}`);
}
if (this.transcribeTask) this.transcribeTask.kill();
if (this.transcribeTask) this.transcribeTask.kill(cs);
this._removeListeners(ep);
}
@@ -103,9 +125,12 @@ class TaskListen extends Task {
this.notifyTaskDone();
}
async updateListen(status) {
async updateListen(status, silence = false) {
if (!this.killed && this.ep && this.ep.connected) {
const args = this._bugname ? [this._bugname] : [];
const args = [
...(this._bugname ? [this._bugname] : []),
...(status === ListenStatus.Pause ? ([silence]) : []),
];
this.logger.info(`TaskListen:updateListen status ${status}`);
switch (status) {
case ListenStatus.Pause:
@@ -221,7 +246,7 @@ class TaskListen extends Task {
}
}
_onConnect(ep) {
this.logger.debug('TaskListen:_onConnect');
this.logger.info('TaskListen:_onConnect');
}
_onConnectFailure(ep, evt) {
this.logger.info(evt, 'TaskListen:_onConnectFailure');

144
lib/tasks/llm/index.js Normal file
View File

@@ -0,0 +1,144 @@
const Task = require('../task');
const {TaskPreconditions} = require('../../utils/constants');
const TaskLlmOpenAI_S2S = require('./llms/openai_s2s');
const TaskLlmVoiceAgent_S2S = require('./llms/voice_agent_s2s');
const TaskLlmUltravox_S2S = require('./llms/ultravox_s2s');
const TaskLlmElevenlabs_S2S = require('./llms/elevenlabs_s2s');
const TaskLlmGoogle_S2S = require('./llms/google_s2s');
const LlmMcpService = require('../../utils/llm-mcp');
class TaskLlm extends Task {
constructor(logger, opts) {
super(logger, opts);
this.preconditions = TaskPreconditions.Endpoint;
['vendor', 'model', 'auth', 'connectOptions'].forEach((prop) => {
this[prop] = this.data[prop];
});
this.eventHandlers = [];
// delegate to the specific llm model
this.llm = this.createSpecificLlm();
// MCP
this.mcpServers = this.data.mcpServers || [];
}
get name() { return this.llm.name ; }
get toolHook() { return this.llm?.toolHook; }
get eventHook() { return this.llm?.eventHook; }
get ep() { return this.cs.ep; }
get mcpService() {
return this.llmMcpService;
}
get isMcpEnabled() {
return this.mcpServers.length > 0;
}
async exec(cs, {ep}) {
await super.exec(cs, {ep});
// create the MCP service if we have MCP servers
if (this.isMcpEnabled) {
this.llmMcpService = new LlmMcpService(this.logger, this.mcpServers);
await this.llmMcpService.init();
}
await this.llm.exec(cs, {ep});
}
async kill(cs) {
super.kill(cs);
await this.llm.kill(cs);
// clean up MCP clients
if (this.isMcpEnabled) {
await this.mcpService.close();
}
}
createSpecificLlm() {
let llm;
switch (this.vendor) {
case 'openai':
case 'microsoft':
llm = new TaskLlmOpenAI_S2S(this.logger, this.data, this);
break;
case 'voiceagent':
case 'deepgram':
llm = new TaskLlmVoiceAgent_S2S(this.logger, this.data, this);
break;
case 'ultravox':
llm = new TaskLlmUltravox_S2S(this.logger, this.data, this);
break;
case 'elevenlabs':
llm = new TaskLlmElevenlabs_S2S(this.logger, this.data, this);
break;
case 'google':
llm = new TaskLlmGoogle_S2S(this.logger, this.data, this);
break;
default:
throw new Error(`Unsupported vendor ${this.vendor} for LLM`);
}
if (!llm) {
throw new Error(`Unsupported vendor:model ${this.vendor}:${this.model}`);
}
return llm;
}
addCustomEventListener(ep, event, handler) {
this.eventHandlers.push({ep, event, handler});
ep.addCustomEventListener(event, handler);
}
removeCustomEventListeners() {
this.eventHandlers.forEach((h) => h.ep.removeCustomEventListener(h.event, h.handler));
}
async sendEventHook(data) {
await this.cs?.requestor.request('llm:event', this.eventHook, data);
}
async sendToolHook(tool_call_id, data) {
const tool_response = await this.cs?.requestor.request('llm:tool-call', this.toolHook, {tool_call_id, ...data});
// if the toolHook was a websocket it will return undefined, otherwise it should return an object
if (typeof tool_response != 'undefined') {
tool_response.type = 'client_tool_result';
tool_response.invocation_id = tool_call_id;
this.processToolOutput(tool_call_id, tool_response);
}
}
async processToolOutput(tool_call_id, data) {
if (!this.ep.connected) {
this.logger.info('TaskLlm:processToolOutput - no connected endpoint');
return;
}
this.llm.processToolOutput(this.ep, tool_call_id, data);
}
async processLlmUpdate(data, callSid) {
if (this.ep.connected) {
if (typeof this.llm.processLlmUpdate === 'function') {
this.llm.processLlmUpdate(this.ep, data, callSid);
}
else {
const {vendor, model} = this.llm;
this.logger.info({data, callSid},
`TaskLlm:_processLlmUpdate: LLM ${vendor}:${model} does not support llm:update`);
}
}
}
}
module.exports = TaskLlm;

View File

@@ -0,0 +1,327 @@
const Task = require('../../task');
const TaskName = 'Llm_Elevenlabs_s2s';
const {LlmEvents_Elevenlabs} = require('../../../utils/constants');
const {request} = require('undici');
const ClientEvent = 'client.event';
const SessionDelete = 'session.delete';
const elevenlabs_server_events = [
'conversation_initiation_metadata',
'user_transcript',
'agent_response',
'client_tool_call'
];
const expandWildcards = (events) => {
const expandedEvents = [];
events.forEach((evt) => {
if (evt.endsWith('.*')) {
const prefix = evt.slice(0, -2); // Remove the wildcard ".*"
const matchingEvents = elevenlabs_server_events.filter((e) => e.startsWith(prefix));
expandedEvents.push(...matchingEvents);
} else {
expandedEvents.push(evt);
}
});
return expandedEvents;
};
class TaskLlmElevenlabs_S2S extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
this.parent = parentTask;
this.vendor = this.parent.vendor;
this.auth = this.parent.auth;
const {agent_id, api_key} = this.auth || {};
if (!agent_id) throw new Error('auth.agent_id is required for Elevenlabs S2S');
this.agent_id = agent_id;
this.api_key = api_key;
this.actionHook = this.data.actionHook;
this.eventHook = this.data.eventHook;
this.toolHook = this.data.toolHook;
const {
conversation_initiation_client_data,
input_sample_rate = 16000,
output_sample_rate = 16000
} = this.data.llmOptions;
this.conversation_initiation_client_data = conversation_initiation_client_data;
this.input_sample_rate = input_sample_rate;
this.output_sample_rate = output_sample_rate;
this.results = {
completionReason: 'normal conversation end'
};
/**
* only one of these will have items,
* if includeEvents, then these are the events to include
* if excludeEvents, then these are the events to exclude
*/
this.includeEvents = [];
this.excludeEvents = [];
/* default to all events if user did not specify */
this._populateEvents(this.data.events || elevenlabs_server_events);
this.addCustomEventListener = parentTask.addCustomEventListener.bind(parentTask);
this.removeCustomEventListeners = parentTask.removeCustomEventListeners.bind(parentTask);
}
get name() { return TaskName; }
async getSignedUrl() {
if (!this.api_key) {
return {
host: 'api.elevenlabs.io',
path: `/v1/convai/conversation?agent_id=${this.agent_id}`,
};
}
const {statusCode, body} = await request(
`https://api.elevenlabs.io/v1/convai/conversation/get_signed_url?agent_id=${this.agent_id}`, {
method: 'GET',
headers: {
'xi-api-key': this.api_key
},
}
);
const data = await body.json();
if (statusCode !== 200 || !data?.signed_url) {
this.logger.error({statusCode, data}, 'Elevenlabs Error registering call');
throw new Error(`Elevenlabs Error registering call: ${data.message}`);
}
const url = new URL(data.signed_url);
return {
host: url.hostname,
path: url.pathname + url.search,
};
}
async _api(ep, args) {
const res = await ep.api('uuid_elevenlabs_s2s', `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
throw new Error(`Error calling uuid_elevenlabs_s2s with args ${args.join('|')}: ${res.body}`);
}
}
async exec(cs, {ep}) {
await super.exec(cs);
await this._startListening(cs, ep);
await this.awaitTaskDone();
/* note: the parent llm verb started the span, which is why this is necessary */
await this.parent.performAction(this.results);
this._unregisterHandlers();
}
async kill(cs) {
super.kill(cs);
this._api(cs.ep, [cs.ep.uuid, SessionDelete])
.catch((err) => this.logger.info({err}, 'TaskLlmElevenlabs_S2S:kill - error deleting session'));
this.notifyTaskDone();
}
/**
* Send function call output to the Elevenlabs server in the form of conversation.item.create
* per https://elevenlabs.io/docs/conversational-ai/api-reference/conversational-ai/websocket
*/
async processToolOutput(ep, tool_call_id, rawData) {
try {
const {data} = rawData;
this.logger.debug({tool_call_id, data}, 'TaskLlmElevenlabs_S2S:processToolOutput');
if (!data.type || data.type !== 'client_tool_result') {
this.logger.info({data},
'TaskLlmElevenlabs_S2S:processToolOutput - invalid tool output, must be client_tool_result');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err}, 'TaskLlmElevenlabs_S2S:processToolOutput');
}
}
/**
* Send a session.update to the Elevenlabs server
* Note: creating and deleting conversation items are also supported, as well as interrupting the assistant
*/
async processLlmUpdate(ep, data, _callSid) {
this.logger.debug({data, _callSid}, 'TaskLlmElevenlabs_S2S:processLlmUpdate, ignored');
}
async _startListening(cs, ep) {
this._registerHandlers(ep);
try {
const {host, path} = await this.getSignedUrl();
const args = this.conversation_initiation_client_data ?
[ep.uuid, 'session.create', this.input_sample_rate, this.output_sample_rate, host, path] :
[ep.uuid, 'session.create', this.input_sample_rate, this.output_sample_rate, host, path, 'no_initial_config'];
await this._api(ep, args);
} catch (err) {
this.logger.error({err}, 'TaskLlmElevenlabs_S2S:_startListening');
this.notifyTaskDone();
}
}
async _sendClientEvent(ep, obj) {
let ok = true;
this.logger.debug({obj}, 'TaskLlmElevenlabs_S2S:_sendClientEvent');
try {
const args = [ep.uuid, ClientEvent, JSON.stringify(obj)];
await this._api(ep, args);
} catch (err) {
ok = false;
this.logger.error({err}, 'TaskLlmElevenlabs_S2S:_sendClientEvent - Error');
}
return ok;
}
async _sendInitialMessage(ep) {
if (this.conversation_initiation_client_data) {
if (!await this._sendClientEvent(ep, {
type: 'conversation_initiation_client_data',
...this.conversation_initiation_client_data
})) {
this.notifyTaskDone();
}
}
}
_registerHandlers(ep) {
this.addCustomEventListener(ep, LlmEvents_Elevenlabs.Connect, this._onConnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Elevenlabs.ConnectFailure, this._onConnectFailure.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Elevenlabs.Disconnect, this._onDisconnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Elevenlabs.ServerEvent, this._onServerEvent.bind(this, ep));
}
_unregisterHandlers() {
this.removeCustomEventListeners();
}
_onError(ep, evt) {
this.logger.info({evt}, 'TaskLlmElevenlabs_S2S:_onError');
this.notifyTaskDone();
}
_onConnect(ep) {
this.logger.debug('TaskLlmElevenlabs_S2S:_onConnect');
this._sendInitialMessage(ep);
}
_onConnectFailure(_ep, evt) {
this.logger.info(evt, 'TaskLlmElevenlabs_S2S:_onConnectFailure');
this.results = {completionReason: 'connection failure'};
this.notifyTaskDone();
}
_onDisconnect(_ep, evt) {
this.logger.info(evt, 'TaskLlmElevenlabs_S2S:_onDisconnect');
this.results = {completionReason: 'disconnect from remote end'};
this.notifyTaskDone();
}
async _onServerEvent(ep, evt) {
let endConversation = false;
const type = evt.type;
this.logger.info({evt}, 'TaskLlmElevenlabs_S2S:_onServerEvent');
if (type === 'error') {
endConversation = true;
this.results = {
completionReason: 'server error',
error: evt.error
};
}
/* tool calls */
else if (type === 'client_tool_call') {
this.logger.debug({evt}, 'TaskLlmElevenlabs_S2S:_onServerEvent - function_call');
const {tool_name: name, tool_call_id: call_id, parameters: args} = evt.client_tool_call;
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools.some((tool) => tool.name === name)) {
this.logger.debug({name, args}, 'TaskLlmElevenlabs_S2S:_onServerEvent - calling mcp tool');
try {
const res = await this.parent.mcpService.callMcpTool(name, args);
this.logger.debug({res}, 'TaskLlmElevenlabs_S2S:_onServerEvent - function_call - mcp result');
this.processToolOutput(ep, call_id, {
data: {
type: 'client_tool_result',
tool_call_id: call_id,
result: res.content?.length ? res.content[0] : res.content,
is_error: false
}
});
return;
}
catch (err) {
this.logger.info({err, evt}, 'TaskLlmElevenlabs_S2S - error calling mcp tool');
this.results = {
completionReason: 'client error calling mcp function',
error: err
};
endConversation = true;
}
} else if (!this.toolHook) {
this.logger.warn({evt}, 'TaskLlmElevenlabs_S2S:_onServerEvent - no toolHook defined!');
}
else {
try {
await this.parent.sendToolHook(call_id, {name, args});
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmElevenlabs_S2S - error calling function');
this.results = {
completionReason: 'client error calling function',
error: err
};
endConversation = true;
}
}
}
/* check whether we should notify on this event */
if (this.includeEvents.length > 0 ? this.includeEvents.includes(type) : !this.excludeEvents.includes(type)) {
this.parent.sendEventHook(evt)
.catch((err) => this.logger.info({err},
'TaskLlmElevenlabs_S2S:_onServerEvent - error sending event hook'));
}
if (endConversation) {
this.logger.info({results: this.results},
'TaskLlmElevenlabs_S2S:_onServerEvent - ending conversation due to error');
this.notifyTaskDone();
}
}
_populateEvents(events) {
if (events.includes('all')) {
/* work by excluding specific events */
const exclude = events
.filter((evt) => evt.startsWith('-'))
.map((evt) => evt.slice(1));
if (exclude.length === 0) this.includeEvents = elevenlabs_server_events;
else this.excludeEvents = expandWildcards(exclude);
}
else {
/* work by including specific events */
const include = events
.filter((evt) => !evt.startsWith('-'));
this.includeEvents = expandWildcards(include);
}
this.logger.debug({
includeEvents: this.includeEvents,
excludeEvents: this.excludeEvents
}, 'TaskLlmElevenlabs_S2S:_populateEvents');
}
}
module.exports = TaskLlmElevenlabs_S2S;

View File

@@ -0,0 +1,319 @@
const Task = require('../../task');
const TaskName = 'Llm_Google_s2s';
const {LlmEvents_Google} = require('../../../utils/constants');
const ClientEvent = 'client.event';
const SessionDelete = 'session.delete';
const google_server_events = [
'error',
'session.created',
'session.updated',
];
const expandWildcards = (events) => {
const expandedEvents = [];
events.forEach((evt) => {
if (evt.endsWith('.*')) {
const prefix = evt.slice(0, -2); // Remove the wildcard ".*"
const matchingEvents = google_server_events.filter((e) => e.startsWith(prefix));
expandedEvents.push(...matchingEvents);
} else {
expandedEvents.push(evt);
}
});
return expandedEvents;
};
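/*
 * Illustrative only (not part of the source): how the events option resolves.
 *   _populateEvents(['all', '-session.*'])
 *     // excludeEvents = expandWildcards(['session.*']) => ['session.created', 'session.updated']
 *     // every other server event is still delivered to the eventHook
 *   _populateEvents(['error', 'session.created'])
 *     // includeEvents = ['error', 'session.created']; only these are delivered
 */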
class TaskLlmGoogle_S2S extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
this.parent = parentTask;
this.vendor = this.parent.vendor;
this.model = this.parent.model || 'models/gemini-2.0-flash-live-001';
this.auth = this.parent.auth;
this.connectionOptions = this.parent.connectOptions;
const {apiKey} = this.auth || {};
if (!apiKey) throw new Error('auth.apiKey is required for Google S2S');
this.apiKey = apiKey;
this.actionHook = this.data.actionHook;
this.eventHook = this.data.eventHook;
this.toolHook = this.data.toolHook;
const {setup} = this.data.llmOptions;
if (typeof setup !== 'object') {
throw new Error('llmOptions with an initial setup is required for Google S2S');
}
this.setup = {
...setup,
model: this.model,
// make sure output is always audio
generationConfig: {
...(setup.generationConfig || {}),
responseModalities: 'audio'
}
};
this.results = {
completionReason: 'normal conversation end'
};
/**
* only one of these will have items,
* if includeEvents, then these are the events to include
* if excludeEvents, then these are the events to exclude
*/
this.includeEvents = [];
this.excludeEvents = [];
/* default to all events if user did not specify */
this._populateEvents(this.data.events || google_server_events);
this.addCustomEventListener = parentTask.addCustomEventListener.bind(parentTask);
this.removeCustomEventListeners = parentTask.removeCustomEventListeners.bind(parentTask);
}
get name() { return TaskName; }
async _api(ep, args) {
const res = await ep.api('uuid_google_s2s', `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
throw new Error(`Error calling uuid_google_s2s: ${res.body}`);
}
}
async exec(cs, {ep}) {
await super.exec(cs);
await this._startListening(cs, ep);
await this.awaitTaskDone();
/* note: the parent llm verb started the span, which is why this is necessary */
await this.parent.performAction(this.results);
this._unregisterHandlers();
}
async kill(cs) {
super.kill(cs);
this._api(cs.ep, [cs.ep.uuid, SessionDelete])
.catch((err) => this.logger.info({err}, 'TaskLlmGoogle_S2S:kill - error deleting session'));
this.notifyTaskDone();
}
_populateEvents(events) {
if (events.includes('all')) {
/* work by excluding specific events */
const exclude = events
.filter((evt) => evt.startsWith('-'))
.map((evt) => evt.slice(1));
if (exclude.length === 0) this.includeEvents = google_server_events;
else this.excludeEvents = expandWildcards(exclude);
}
else {
/* work by including specific events */
const include = events
.filter((evt) => !evt.startsWith('-'));
this.includeEvents = expandWildcards(include);
}
this.logger.debug({
includeEvents: this.includeEvents,
excludeEvents: this.excludeEvents
}, 'TaskLlmGoogle_S2S:_populateEvents');
}
async _startListening(cs, ep) {
this._registerHandlers(ep);
try {
const args = [ep.uuid, 'session.create', this.apiKey];
await this._api(ep, args);
} catch (err) {
this.logger.error({err}, 'TaskLlmGoogle_S2S:_startListening');
this.notifyTaskDone();
}
}
async _sendClientEvent(ep, obj) {
let ok = true;
this.logger.debug({obj}, 'TaskLlmGoogle_S2S:_sendClientEvent');
try {
const args = [ep.uuid, ClientEvent, JSON.stringify(obj)];
await this._api(ep, args);
} catch (err) {
ok = false;
this.logger.error({err}, 'TaskLlmGoogle_S2S:_sendClientEvent - Error');
}
return ok;
}
async _sendInitialMessage(ep) {
const setup = this.setup;
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools && mcpTools.length > 0) {
const convertedTools = [
{
functionDeclarations: mcpTools.map((tool) => {
if (tool.inputSchema) {
delete tool.inputSchema.additionalProperties;
delete tool.inputSchema['$schema'];
}
return {
name: tool.name,
description: tool.description,
parameters: tool.inputSchema,
};
})
}
];
// merge with any existing tools
setup.tools = [...convertedTools, ...(this.setup.tools || [])];
}
this.logger.debug({setup}, 'TaskLlmGoogle_S2S:_sendInitialMessage - sending setup');
if (!await this._sendClientEvent(ep, {setup})) {
this.notifyTaskDone();
}
}
_registerHandlers(ep) {
this.addCustomEventListener(ep, LlmEvents_Google.Connect, this._onConnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Google.ConnectFailure, this._onConnectFailure.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Google.Disconnect, this._onDisconnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Google.ServerEvent, this._onServerEvent.bind(this, ep));
}
_unregisterHandlers() {
this.removeCustomEventListeners();
}
_onError(ep, evt) {
this.logger.info({evt}, 'TaskLlmGoogle_S2S:_onError');
this.notifyTaskDone();
}
_onConnect(ep) {
this.logger.debug('TaskLlmGoogle_S2S:_onConnect');
this._sendInitialMessage(ep);
}
_onConnectFailure(_ep, evt) {
this.logger.info(evt, 'TaskLlmGoogle_S2S:_onConnectFailure');
this.results = {completionReason: 'connection failure'};
this.notifyTaskDone();
}
_onDisconnect(_ep, evt) {
this.logger.info(evt, 'TaskLlmGoogle_S2S:_onDisconnect');
this.results = {completionReason: 'disconnect from remote end'};
this.notifyTaskDone();
}
async _onServerEvent(ep, evt) {
let endConversation = false;
this.logger.debug({evt}, 'TaskLlmGoogle_S2S:_onServerEvent');
const {toolCall /**toolCallCancellation*/} = evt;
if (toolCall) {
this.logger.debug({toolCall}, 'TaskLlmGoogle_S2S:_onServerEvent - toolCall');
if (!this.toolHook) {
this.logger.info({evt}, 'TaskLlmGoogle_S2S:_onServerEvent - no toolHook defined!');
}
else {
const {functionCalls} = toolCall;
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
const functionResponses = [];
if (mcpTools && mcpTools.length > 0) {
for (const functionCall of functionCalls) {
const {name, args, id} = functionCall;
const tool = mcpTools.find((tool) => tool.name === name);
if (tool) {
const response = await this.parent.mcpService.callMcpTool(name, args);
functionResponses.push({
response: {
output: response,
},
id
});
}
}
}
if (functionResponses && functionResponses.length > 0) {
this.logger.debug({functionResponses}, 'TaskLlmGoogle_S2S:_onServerEvent - function_call - mcp result');
this.processToolOutput(ep, 'tool_call_id', {
toolResponse: {
functionResponses
}
});
} else {
try {
await this.parent.sendToolHook('function_call_id', {type: 'toolCall', functionCalls});
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmGoogle_S2S - error calling function');
this.results = {
completionReason: 'client error calling function',
error: err
};
endConversation = true;
}
}
}
}
this._sendLlmEvent('llm_event', evt);
if (endConversation) {
this.logger.info({results: this.results},
'TaskLlmGoogle_S2S:_onServerEvent - ending conversation due to error');
this.notifyTaskDone();
}
}
_sendLlmEvent(type, evt) {
/* check whether we should notify on this event */
if (this.includeEvents.length > 0 ? this.includeEvents.includes(type) : !this.excludeEvents.includes(type)) {
this.parent.sendEventHook(evt)
.catch((err) => this.logger.info({err}, 'TaskLlmGoogle_S2S:_onServerEvent - error sending event hook'));
}
}
async processLlmUpdate(ep, data, _callSid) {
try {
this.logger.debug({data, _callSid}, 'TaskLlmGoogle_S2S:processLlmUpdate');
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
} catch (err) {
this.logger.info({err, data}, 'TaskLlmGoogle_S2S:processLlmUpdate - Error processing LLM update');
}
}
async processToolOutput(ep, tool_call_id, data) {
try {
this.logger.debug({tool_call_id, data}, 'TaskLlmGoogle_S2S:processToolOutput');
const {toolResponse} = data;
if (!toolResponse) {
this.logger.info({data},
'TaskLlmGoogle_S2S:processToolOutput - invalid tool output, must be functionResponses');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err, data}, 'TaskLlmGoogle_S2S:processToolOutput - Error processing tool output');
}
}
}
module.exports = TaskLlmGoogle_S2S;

View File

@@ -0,0 +1,398 @@
const Task = require('../../task');
const TaskName = 'Llm_OpenAI_s2s';
const {LlmEvents_OpenAI} = require('../../../utils/constants');
const ClientEvent = 'client.event';
const SessionDelete = 'session.delete';
const openai_server_events = [
'error',
'session.created',
'session.updated',
'conversation.created',
'input_audio_buffer.committed',
'input_audio_buffer.cleared',
'input_audio_buffer.speech_started',
'input_audio_buffer.speech_stopped',
'conversation.item.created',
'conversation.item.input_audio_transcription.completed',
'conversation.item.input_audio_transcription.failed',
'conversation.item.truncated',
'conversation.item.deleted',
'response.created',
'response.done',
'response.output_item.added',
'response.output_item.done',
'response.content_part.added',
'response.content_part.done',
'response.text.delta',
'response.text.done',
'response.audio_transcript.delta',
'response.audio_transcript.done',
'response.audio.delta',
'response.audio.done',
'response.function_call_arguments.delta',
'response.function_call_arguments.done',
'rate_limits.updated',
'output_audio.playback_started',
'output_audio.playback_stopped',
];
const expandWildcards = (events) => {
const expandedEvents = [];
events.forEach((evt) => {
if (evt.endsWith('.*')) {
const prefix = evt.slice(0, -2); // Remove the wildcard ".*"
const matchingEvents = openai_server_events.filter((e) => e.startsWith(prefix));
expandedEvents.push(...matchingEvents);
} else {
expandedEvents.push(evt);
}
});
return expandedEvents;
};
class TaskLlmOpenAI_S2S extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
this.parent = parentTask;
this.vendor = this.parent.vendor;
this.model = this.parent.model || 'gpt-4o-realtime-preview-2024-12-17';
this.auth = this.parent.auth;
this.connectionOptions = this.parent.connectOptions;
const {apiKey} = this.auth || {};
if (!apiKey) throw new Error('auth.apiKey is required for OpenAI S2S');
if (['openai', 'microsoft'].indexOf(this.vendor) === -1) {
throw new Error(`Invalid vendor ${this.vendor} for OpenAI S2S`);
}
if ('microsoft' === this.vendor && !this.connectionOptions?.host) {
throw new Error('connectionOptions.host is required for Microsoft OpenAI S2S');
}
this.apiKey = apiKey;
this.authType = 'microsoft' === this.vendor ? 'query' : 'bearer';
this.actionHook = this.data.actionHook;
this.eventHook = this.data.eventHook;
this.toolHook = this.data.toolHook;
const {response_create, session_update} = this.data.llmOptions;
if (typeof response_create !== 'object') {
throw new Error('llmOptions with an initial response.create is required for OpenAI S2S');
}
this.response_create = response_create;
this.session_update = session_update;
this.results = {
completionReason: 'normal conversation end'
};
/**
* only one of these will have items,
* if includeEvents, then these are the events to include
* if excludeEvents, then these are the events to exclude
*/
this.includeEvents = [];
this.excludeEvents = [];
/* default to all events if user did not specify */
this._populateEvents(this.data.events || openai_server_events);
this.addCustomEventListener = parentTask.addCustomEventListener.bind(parentTask);
this.removeCustomEventListeners = parentTask.removeCustomEventListeners.bind(parentTask);
}
get name() { return TaskName; }
get host() {
const {host} = this.connectionOptions || {};
return host || (this.vendor === 'openai' ? 'api.openai.com' : void 0);
}
get path() {
const {path} = this.connectionOptions || {};
if (path) return path;
switch (this.vendor) {
case 'openai':
return `v1/realtime?model=${this.model}`;
case 'microsoft':
return `openai/realtime?api-version=2024-10-01-preview&deployment=${this.model}`;
}
}
async _api(ep, args) {
const res = await ep.api('uuid_openai_s2s', `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
throw new Error(`Error calling uuid_openai_s2s: ${res.body}`);
}
}
async exec(cs, {ep}) {
await super.exec(cs);
await this._startListening(cs, ep);
await this.awaitTaskDone();
/* note: the parent llm verb started the span, which is why this is necessary */
await this.parent.performAction(this.results);
this._unregisterHandlers();
}
async kill(cs) {
super.kill(cs);
this._api(cs.ep, [cs.ep.uuid, SessionDelete])
.catch((err) => this.logger.info({err}, 'TaskLlmOpenAI_S2S:kill - error deleting session'));
this.notifyTaskDone();
}
/**
* Send function call output to the OpenAI server in the form of conversation.item.create
* per https://platform.openai.com/docs/guides/realtime/function-calls
*/
async processToolOutput(ep, tool_call_id, data) {
try {
this.logger.debug({tool_call_id, data}, 'TaskLlmOpenAI_S2S:processToolOutput');
if (!data.type || data.type !== 'conversation.item.create') {
this.logger.info({data},
'TaskLlmOpenAI_S2S:processToolOutput - invalid tool output, must be conversation.item.create');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
// spec also recommends to send immediate response.create
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify({type: 'response.create'})]);
}
} catch (err) {
this.logger.info({err}, 'TaskLlmOpenAI_S2S:processToolOutput');
}
}
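/*
 * Illustrative payload (hypothetical values, not from the source): the application
 * (or an mcp tool handler) supplies the function result wrapped in
 * conversation.item.create; this task then issues the follow-up response.create.
 *   await task.processToolOutput(ep, 'call_abc123', {
 *     type: 'conversation.item.create',
 *     item: {
 *       type: 'function_call_output',
 *       call_id: 'call_abc123',
 *       output: JSON.stringify({temperature: 72, unit: 'F'})
 *     }
 *   });
 */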
/**
* Send a session.update to the OpenAI server
* Note: creating and deleting conversation items also supported as well as interrupting the assistant
*/
async processLlmUpdate(ep, data, _callSid) {
try {
this.logger.debug({data, _callSid}, 'TaskLlmOpenAI_S2S:processLlmUpdate');
if (!data.type || ![
'session.update',
'conversation.item.create',
'conversation.item.delete',
'response.cancel'
].includes(data.type)) {
this.logger.info({data}, 'TaskLlmOpenAI_S2S:processLlmUpdate - invalid mid-call request');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err}, 'TaskLlmOpenAI_S2S:processLlmUpdate');
}
}
async _startListening(cs, ep) {
this._registerHandlers(ep);
try {
const args = [ep.uuid, 'session.create', this.host, this.path, this.authType, this.apiKey];
await this._api(ep, args);
} catch (err) {
this.logger.error({err}, 'TaskLlmOpenAI_S2S:_startListening');
this.notifyTaskDone();
}
}
async _sendClientEvent(ep, obj) {
let ok = true;
this.logger.debug({obj}, 'TaskLlmOpenAI_S2S:_sendClientEvent');
try {
const args = [ep.uuid, ClientEvent, JSON.stringify(obj)];
await this._api(ep, args);
} catch (err) {
ok = false;
this.logger.error({err}, 'TaskLlmOpenAI_S2S:_sendClientEvent - Error');
}
return ok;
}
async _sendInitialMessage(ep) {
let obj = {type: 'response.create', response: this.response_create};
if (!await this._sendClientEvent(ep, obj)) {
this.notifyTaskDone();
}
/* send immediate session.update if present */
else if (this.session_update) {
if (this.parent.isMcpEnabled) {
this.logger.debug('TaskLlmOpenAI_S2S:_sendInitialMessage - mcp enabled');
const tools = await this.parent.mcpService.getAvailableMcpTools();
if (tools && tools.length > 0 && this.session_update) {
const convertedTools = tools.map((tool) => ({
name: tool.name,
type: 'function',
description: tool.description,
parameters: tool.inputSchema
}));
this.session_update.tools = [
...convertedTools,
...(this.session_update.tools || [])
];
}
}
obj = {type: 'session.update', session: this.session_update};
this.logger.debug({obj}, 'TaskLlmOpenAI_S2S:_sendInitialMessage - sending session.update');
if (!await this._sendClientEvent(ep, obj)) {
this.notifyTaskDone();
}
}
}
_registerHandlers(ep) {
this.addCustomEventListener(ep, LlmEvents_OpenAI.Connect, this._onConnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_OpenAI.ConnectFailure, this._onConnectFailure.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_OpenAI.Disconnect, this._onDisconnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_OpenAI.ServerEvent, this._onServerEvent.bind(this, ep));
}
_unregisterHandlers() {
this.removeCustomEventListeners();
}
_onError(ep, evt) {
this.logger.info({evt}, 'TaskLlmOpenAI_S2S:_onError');
this.notifyTaskDone();
}
_onConnect(ep) {
this.logger.debug('TaskLlmOpenAI_S2S:_onConnect');
this._sendInitialMessage(ep);
}
_onConnectFailure(_ep, evt) {
this.logger.info(evt, 'TaskLlmOpenAI_S2S:_onConnectFailure');
this.results = {completionReason: 'connection failure'};
this.notifyTaskDone();
}
_onDisconnect(_ep, evt) {
this.logger.info(evt, 'TaskLlmOpenAI_S2S:_onDisconnect');
this.results = {completionReason: 'disconnect from remote end'};
this.notifyTaskDone();
}
async _onServerEvent(ep, evt) {
let endConversation = false;
const type = evt.type;
this.logger.info({evt}, 'TaskLlmOpenAI_S2S:_onServerEvent');
/* check for failures, such as rate limit exceeded, that should terminate the conversation */
if (type === 'response.done' && evt.response.status === 'failed') {
endConversation = true;
this.results = {
completionReason: 'server failure',
error: evt.response.status_details?.error
};
}
/* server errors of some sort */
else if (type === 'error') {
endConversation = true;
this.results = {
completionReason: 'server error',
error: evt.error
};
}
/* tool calls */
else if (type === 'response.output_item.done' && evt.item?.type === 'function_call') {
this.logger.debug({evt}, 'TaskLlmOpenAI_S2S:_onServerEvent - function_call');
const {name, call_id} = evt.item;
const args = JSON.parse(evt.item.arguments);
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools.some((tool) => tool.name === name)) {
this.logger.debug({call_id, name, args}, 'TaskLlmOpenAI_S2S:_onServerEvent - calling mcp tool');
try {
const res = await this.parent.mcpService.callMcpTool(name, args);
this.logger.debug({res}, 'TaskLlmOpenAI_S2S:_onServerEvent - function_call - mcp result');
this.processToolOutput(ep, call_id, {
type: 'conversation.item.create',
item: {
type: 'function_call_output',
call_id,
output: res.content[0]?.text || 'There is no output from the function call',
}
});
return;
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmOpenAI_S2S - error calling function');
this.results = {
completionReason: 'client error calling mcp function',
error: err
};
endConversation = true;
}
}
else if (!this.toolHook) {
this.logger.warn({evt}, 'TaskLlmOpenAI_S2S:_onServerEvent - no toolHook defined!');
}
else {
try {
await this.parent.sendToolHook(call_id, {name, args});
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmOpenAI - error calling function');
this.results = {
completionReason: 'client error calling function',
error: err
};
endConversation = true;
}
}
}
/* check whether we should notify on this event */
if (this.includeEvents.length > 0 ? this.includeEvents.includes(type) : !this.excludeEvents.includes(type)) {
this.parent.sendEventHook(evt)
.catch((err) => this.logger.info({err}, 'TaskLlmOpenAI_S2S:_onServerEvent - error sending event hook'));
}
if (endConversation) {
this.logger.info({results: this.results}, 'TaskLlmOpenAI_S2S:_onServerEvent - ending conversation due to error');
this.notifyTaskDone();
}
}
_populateEvents(events) {
if (events.includes('all')) {
/* work by excluding specific events */
const exclude = events
.filter((evt) => evt.startsWith('-'))
.map((evt) => evt.slice(1));
if (exclude.length === 0) this.includeEvents = openai_server_events;
else this.excludeEvents = expandWildcards(exclude);
}
else {
/* work by including specific events */
const include = events
.filter((evt) => !evt.startsWith('-'));
this.includeEvents = expandWildcards(include);
}
this.logger.debug({
includeEvents: this.includeEvents,
excludeEvents: this.excludeEvents
}, 'TaskLlmOpenAI_S2S:_populateEvents');
}
}
module.exports = TaskLlmOpenAI_S2S;

View File

@@ -0,0 +1,351 @@
const Task = require('../../task');
const TaskName = 'Llm_Ultravox_s2s';
const {request} = require('undici');
const {LlmEvents_Ultravox} = require('../../../utils/constants');
const ultravox_server_events = [
'createCall',
'pong',
'state',
'transcript',
'conversationText',
'clientToolInvocation',
'playbackClearBuffer',
];
const ClientEvent = 'client.event';
const expandWildcards = (events) => {
// no-op for ultravox: its event names have no dotted prefixes to expand
return events;
};
const SessionDelete = 'session.delete';
class TaskLlmUltravox_S2S extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
this.parent = parentTask;
this.vendor = this.parent.vendor;
this.model = this.parent.model || 'fixie-ai/ultravox';
this.auth = this.parent.auth;
this.connectionOptions = this.parent.connectOptions;
const {apiKey, agent_id} = this.auth || {};
if (!apiKey) throw new Error('auth.apiKey is required for Vendor: Ultravox');
this.apiKey = apiKey;
this.agentId = agent_id;
this.actionHook = this.data.actionHook;
this.eventHook = this.data.eventHook;
this.toolHook = this.data.toolHook;
this.llmOptions = this.data.llmOptions || {};
this.results = {
completionReason: 'normal conversation end'
};
/**
* only one of these will have items,
* if includeEvents, then these are the events to include
* if excludeEvents, then these are the events to exclude
*/
this.includeEvents = [];
this.excludeEvents = [];
/* default to all events if user did not specify */
this._populateEvents(this.data.events || ultravox_server_events);
this.addCustomEventListener = parentTask.addCustomEventListener.bind(parentTask);
this.removeCustomEventListeners = parentTask.removeCustomEventListeners.bind(parentTask);
}
get name() { return TaskName; }
async _api(ep, args) {
const res = await ep.api('uuid_ultravox_s2s', `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
throw new Error(`Error calling uuid_ultravox_s2s: ${JSON.stringify(res.body)}`);
}
}
/**
* Converts a JSON Schema to the dynamic parameters format used in the Ultravox API
* @param {Object} jsonSchema - A JSON Schema object defining parameters
* @param {string} locationDefault - Default location value for parameters (default: 'PARAMETER_LOCATION_BODY')
* @returns {Array} Array of dynamic parameters objects
*/
transformSchemaToParameters(jsonSchema, locationDefault = 'PARAMETER_LOCATION_BODY') {
if (jsonSchema.properties) {
const required = jsonSchema.required || [];
return Object.entries(jsonSchema.properties).map(([name]) => {
return {
name,
location: locationDefault,
required: required.includes(name)
};
});
}
return [];
}
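/*
 * Illustrative only (hypothetical schema, not from the source):
 *   transformSchemaToParameters({
 *     type: 'object',
 *     properties: {city: {type: 'string'}, units: {type: 'string'}},
 *     required: ['city']
 *   })
 *   // => [
 *   //   {name: 'city', location: 'PARAMETER_LOCATION_BODY', required: true},
 *   //   {name: 'units', location: 'PARAMETER_LOCATION_BODY', required: false}
 *   // ]
 */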
async createCall() {
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools && mcpTools.length > 0) {
const convertedTools = mcpTools.map((tool) => {
return {
temporaryTool: {
modelToolName: tool.name,
description: tool.description,
dynamicParameters: this.transformSchemaToParameters(tool.inputSchema),
// register as a client tool so that ultravox invokes it via the freeswitch module
client: {}
}
};
}
);
// merge with any existing tools
this.llmOptions.selectedTools = [
...convertedTools,
...(this.llmOptions.selectedTools || [])
];
}
const payload = {
...this.llmOptions,
...(!this.agentId && {
model: this.model,
}),
medium: {
...(this.llmOptions.medium || {}),
serverWebSocket: {
inputSampleRate: 8000,
outputSampleRate: 8000,
}
}
};
const baseUrl = 'https://api.ultravox.ai';
const url = this.agentId ?
`${baseUrl}/api/agents/${this.agentId}/calls` : `${baseUrl}/api/calls`;
const {statusCode, body} = await request(url, {
method: 'POST',
headers: {
'Content-Type': 'application/json',
'X-API-Key': this.apiKey
},
body: JSON.stringify(payload)
});
const data = await body.json();
if (statusCode !== 201 || !data?.joinUrl) {
this.logger.info({statusCode, data}, 'Ultravox Error registering call');
throw new Error(`Ultravox Error registering call: ${statusCode} - ${data.detail}`);
}
this.logger.debug({joinUrl: data.joinUrl}, 'Ultravox Call registered');
return data;
}
_unregisterHandlers() {
this.removeCustomEventListeners();
}
_registerHandlers(ep) {
this.addCustomEventListener(ep, LlmEvents_Ultravox.Connect, this._onConnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Ultravox.ConnectFailure, this._onConnectFailure.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Ultravox.Disconnect, this._onDisconnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_Ultravox.ServerEvent, this._onServerEvent.bind(this, ep));
}
async _startListening(cs, ep) {
this._registerHandlers(ep);
try {
const data = await this.createCall();
const {joinUrl} = data;
// split the joinUrl into host and path
const {host, pathname, search} = new URL(joinUrl);
const args = [ep.uuid, 'session.create', host, pathname + search];
await this._api(ep, args);
// Notify the application that the session has been created, including the call details
this._sendLlmEvent('createCall', {
type: 'createCall',
...data
});
} catch (err) {
this.logger.info({err}, 'TaskLlmUltraVox_S2S:_startListening - Error sending createCall');
this.results = {completionReason: `connection failure - ${err}`};
this.notifyTaskDone();
}
}
async exec(cs, {ep}) {
await super.exec(cs);
await this._startListening(cs, ep);
await this.awaitTaskDone();
/* note: the parent llm verb started the span, which is why this is necessary */
await this.parent.performAction(this.results);
this._unregisterHandlers();
}
async kill(cs) {
super.kill(cs);
this._api(cs.ep, [cs.ep.uuid, SessionDelete])
.catch((err) => this.logger.info({err}, 'TaskLlmUltravox_S2S:kill - error deleting session'));
this.notifyTaskDone();
}
_onConnect(ep) {
this.logger.info('TaskLlmUltravox_S2S:_onConnect');
}
_onConnectFailure(_ep, evt) {
this.logger.info(evt, 'TaskLlmUltravox_S2S:_onConnectFailure');
this.results = {completionReason: 'connection failure'};
this.notifyTaskDone();
}
_onDisconnect(_ep, evt) {
this.logger.info(evt, 'TaskLlmUltravox_S2S:_onDisconnect');
this.results = {completionReason: 'disconnect from remote end'};
this.notifyTaskDone();
}
async _onServerEvent(_ep, evt) {
let endConversation = false;
const type = evt.type;
//this.logger.debug({evt}, 'TaskLlmUltravox_S2S:_onServerEvent');
/* server errors of some sort */
if (type === 'error') {
endConversation = true;
this.results = {
completionReason: 'server error',
error: evt.error
};
}
/* tool calls */
else if (type === 'client_tool_invocation') {
this.logger.debug({evt}, 'TaskLlmUltravox_S2S:_onServerEvent - function_call');
const {toolName: name, invocationId: call_id, parameters: args} = evt;
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools.some((tool) => tool.name === name)) {
this.logger.debug({
name,
input: args
}, 'TaskLlmUltravox_S2S:_onServerEvent - function_call - mcp tool');
try {
const res = await this.parent.mcpService.callMcpTool(name, args);
this.logger.debug({res}, 'TaskLlmUltravox_S2S:_onServerEvent - function_call - mcp result');
this.processToolOutput(_ep, call_id, {
type: 'client_tool_result',
invocation_id: call_id,
result: res.content
});
return;
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmUltravox_S2S - error calling mcp tool');
this.results = {
completionReason: 'client error calling mcp function',
error: err
};
endConversation = true;
}
} else if (!this.toolHook) {
this.logger.info({evt}, 'TaskLlmUltravox_S2S:_onServerEvent - no toolHook defined!');
}
else {
try {
await this.parent.sendToolHook(call_id, {name, args});
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmUltravox_S2S - error calling function');
this.results = {
completionReason: 'client error calling function',
error: err
};
endConversation = true;
}
}
}
this._sendLlmEvent(type, evt);
if (endConversation) {
this.logger.info({results: this.results},
'TaskLlmUltravox_S2S:_onServerEvent - ending conversation due to error');
this.notifyTaskDone();
}
}
_sendLlmEvent(type, evt) {
/* check whether we should notify on this event */
if (this.includeEvents.length > 0 ? this.includeEvents.includes(type) : !this.excludeEvents.includes(type)) {
this.parent.sendEventHook(evt)
.catch((err) => this.logger.info({err}, 'TaskLlmUltravox_S2S:_onServerEvent - error sending event hook'));
}
}
async processLlmUpdate(ep, data, _callSid) {
try {
this.logger.debug({data, _callSid}, 'TaskLlmUltravox_S2S:processLlmUpdate');
if (!data.type || ![
'input_text_message'
].includes(data.type)) {
this.logger.info({data},
'TaskLlmUltravox_S2S:processLlmUpdate - invalid mid-call request, only input_text_message supported');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err, data}, 'TaskLlmUltravox_S2S:processLlmUpdate - Error processing LLM update');
}
}
async processToolOutput(ep, tool_call_id, data) {
try {
this.logger.debug({tool_call_id, data}, 'TaskLlmUltravox_S2S:processToolOutput');
if (!data.type || data.type !== 'client_tool_result') {
this.logger.info({data},
'TaskLlmUltravox_S2S:processToolOutput - invalid tool output, must be client_tool_result');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err, data}, 'TaskLlmUltravox_S2S:processToolOutput - Error processing tool output');
}
}
_populateEvents(events) {
if (events.includes('all')) {
/* work by excluding specific events */
const exclude = events
.filter((evt) => evt.startsWith('-'))
.map((evt) => evt.slice(1));
if (exclude.length === 0) this.includeEvents = ultravox_server_events;
else this.excludeEvents = expandWildcards(exclude);
}
else {
/* work by including specific events */
const include = events
.filter((evt) => !evt.startsWith('-'));
this.includeEvents = expandWildcards(include);
}
this.logger.debug({
includeEvents: this.includeEvents,
excludeEvents: this.excludeEvents
}, 'TaskLlmUltravox_S2S:_populateEvents');
}
}
module.exports = TaskLlmUltravox_S2S;

View File

@@ -0,0 +1,352 @@
const Task = require('../../task');
const TaskName = 'Llm_VoiceAgent_s2s';
const {LlmEvents_VoiceAgent} = require('../../../utils/constants');
const ClientEvent = 'client.event';
const SessionDelete = 'session.delete';
const va_server_events = [
'Error',
'Welcome',
'SettingsApplied',
'ConversationText',
'UserStartedSpeaking',
'EndOfThought',
'AgentThinking',
'FunctionCallRequest',
'FunctionCalling',
'AgentStartedSpeaking',
'AgentAudioDone',
];
const expandWildcards = (events) => {
// no-op for deepgram
return events;
};
class TaskLlmVoiceAgent_S2S extends Task {
constructor(logger, opts, parentTask) {
super(logger, opts, parentTask);
this.parent = parentTask;
this.vendor = this.parent.vendor;
this.model = this.parent.model || 'voice-agent';
this.auth = this.parent.auth;
this.connectionOptions = this.parent.connectOptions;
const {apiKey} = this.auth || {};
if (!apiKey) throw new Error('auth.apiKey is required for VoiceAgent S2S');
this.apiKey = apiKey;
this.authType = 'bearer';
this.actionHook = this.data.actionHook;
this.eventHook = this.data.eventHook;
this.toolHook = this.data.toolHook;
const {Settings} = this.data.llmOptions;
if (typeof Settings !== 'object') {
throw new Error('llmOptions with an initial Settings is required for VoiceAgent S2S');
}
// eslint-disable-next-line no-unused-vars
const {audio, ...rest} = Settings;
const cfg = this.Settings = rest;
if (!cfg.agent) throw new Error('llmOptions.Settings.agent is required for VoiceAgent S2S');
if (!cfg.agent.think) {
throw new Error('llmOptions.Settings.agent.think is required for VoiceAgent S2S');
}
if (!cfg.agent.think.provider?.model) {
throw new Error('llmOptions.Settings.agent.think.provider.model is required for VoiceAgent S2S');
}
if (!cfg.agent.think.provider?.type) {
throw new Error('llmOptions.Settings.agent.think.provider.type is required for VoiceAgent S2S');
}
this.results = {
completionReason: 'normal conversation end'
};
/**
* only one of these will have items,
* if includeEvents, then these are the events to include
* if excludeEvents, then these are the events to exclude
*/
this.includeEvents = [];
this.excludeEvents = [];
/* default to all events if user did not specify */
this._populateEvents(this.data.events || va_server_events);
this.addCustomEventListener = parentTask.addCustomEventListener.bind(parentTask);
this.removeCustomEventListeners = parentTask.removeCustomEventListeners.bind(parentTask);
}
get name() { return TaskName; }
get host() {
const {host} = this.connectionOptions || {};
return host || 'agent.deepgram.com';
}
get path() {
const {path} = this.connectionOptions || {};
if (path) return path;
return '/v1/agent/converse';
}
async _api(ep, args) {
const res = await ep.api('uuid_voice_agent_s2s', `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
throw new Error(`Error calling uuid_voice_agent_s2s: ${JSON.stringify(res.body)}`);
}
}
async exec(cs, {ep}) {
await super.exec(cs);
await this._startListening(cs, ep);
await this.awaitTaskDone();
/* note: the parent llm verb started the span, which is why this is necessary */
await this.parent.performAction(this.results);
this._unregisterHandlers();
}
async kill(cs) {
super.kill(cs);
this._api(cs.ep, [cs.ep.uuid, SessionDelete])
.catch((err) => this.logger.info({err}, 'TaskLlmVoiceAgent_S2S:kill - error deleting session'));
this.notifyTaskDone();
}
/**
* Send function call response to the VoiceAgent server
*/
async processToolOutput(ep, tool_call_id, data) {
try {
const {data:response} = data;
this.logger.debug({tool_call_id, response}, 'TaskLlmVoiceAgent_S2S:processToolOutput');
if (!response.type || response.type !== 'FunctionCallResponse') {
this.logger.info({response},
'TaskLlmVoiceAgent_S2S:processToolOutput - invalid tool output, must be FunctionCallResponse');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(response)]);
}
} catch (err) {
this.logger.info({err}, 'TaskLlmVoiceAgent_S2S:processToolOutput');
}
}
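/*
 * Illustrative payload (hypothetical values, not from the source): the tool result
 * arrives under a data property whose type must be FunctionCallResponse, e.g.
 *   await task.processToolOutput(ep, 'fn_1', {
 *     data: {
 *       type: 'FunctionCallResponse',
 *       id: 'fn_1',
 *       name: 'get_weather',
 *       content: 'Sunny and 72F'
 *     }
 *   });
 */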
/**
* Send a session.update to the VoiceAgent server
* Note: creating and deleting conversation items also supported as well as interrupting the assistant
*/
async processLlmUpdate(ep, data, _callSid) {
try {
this.logger.debug({data, _callSid}, 'TaskLlmVoiceAgent_S2S:processLlmUpdate');
if (!data.type || ![
'UpdateInstructions',
'UpdateSpeak',
'InjectAgentMessage',
].includes(data.type)) {
this.logger.info({data}, 'TaskLlmVoiceAgent_S2S:processLlmUpdate - invalid mid-call request');
}
else {
await this._api(ep, [ep.uuid, ClientEvent, JSON.stringify(data)]);
}
} catch (err) {
this.logger.info({err}, 'TaskLlmVoiceAgent_S2S:processLlmUpdate');
}
}
async _startListening(cs, ep) {
this._registerHandlers(ep);
try {
const args = [ep.uuid, 'session.create', this.host, this.path, this.authType, this.apiKey];
await this._api(ep, args);
} catch (err) {
this.logger.error({err}, 'TaskLlmVoiceAgent_S2S:_startListening');
this.notifyTaskDone();
}
}
async _sendClientEvent(ep, obj) {
let ok = true;
this.logger.debug({obj}, 'TaskLlmVoiceAgent_S2S:_sendClientEvent');
try {
const args = [ep.uuid, ClientEvent, JSON.stringify(obj)];
await this._api(ep, args);
} catch (err) {
ok = false;
this.logger.error({err}, 'TaskLlmVoiceAgent_S2S:_sendClientEvent - Error');
}
return ok;
}
async _sendInitialMessage(ep) {
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (mcpTools && mcpTools.length > 0 && this.Settings.agent?.think) {
const convertedTools = mcpTools.map((tool) => ({
name: tool.name,
description: tool.description,
parameters: tool.inputSchema
}));
this.Settings.agent.think.functions = [
...convertedTools,
...(this.Settings.agent.think?.functions || [])
];
}
if (!await this._sendClientEvent(ep, this.Settings)) {
this.notifyTaskDone();
}
}
_registerHandlers(ep) {
this.addCustomEventListener(ep, LlmEvents_VoiceAgent.Connect, this._onConnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_VoiceAgent.ConnectFailure, this._onConnectFailure.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_VoiceAgent.Disconnect, this._onDisconnect.bind(this, ep));
this.addCustomEventListener(ep, LlmEvents_VoiceAgent.ServerEvent, this._onServerEvent.bind(this, ep));
}
_unregisterHandlers() {
this.removeCustomEventListeners();
}
_onError(_ep, evt) {
this.logger.info({evt}, 'TaskLlmVoiceAgent_S2S:_onError');
this.notifyTaskDone();
}
_onConnect(ep) {
this.logger.debug('TaskLlmVoiceAgent_S2S:_onConnect');
this._sendInitialMessage(ep);
}
_onConnectFailure(_ep, evt) {
this.logger.info(evt, 'TaskLlmVoiceAgent_S2S:_onConnectFailure');
this.results = {completionReason: 'connection failure'};
this.notifyTaskDone();
}
_onDisconnect(_ep, evt) {
this.logger.info(evt, 'TaskLlmVoiceAgent_S2S:_onDisconnect');
this.results = {completionReason: 'disconnect from remote end'};
this.notifyTaskDone();
}
async _onServerEvent(_ep, evt) {
let endConversation = false;
const type = evt.type;
this.logger.info({evt}, 'TaskLlmVoiceAgent_S2S:_onServerEvent');
/* check for failures, such as rate limit exceeded, that should terminate the conversation */
if (type === 'response.done' && evt.response.status === 'failed') {
endConversation = true;
this.results = {
completionReason: 'server failure',
error: evt.response.status_details?.error
};
}
/* server errors of some sort */
else if (type === 'error') {
endConversation = true;
this.results = {
completionReason: 'server error',
error: evt.error
};
}
/* tool calls */
else if (type === 'FunctionCallRequest') {
this.logger.debug({evt}, 'TaskLlmVoiceAgent_S2S:_onServerEvent - function_call');
const mcpTools = this.parent.isMcpEnabled ? await this.parent.mcpService.getAvailableMcpTools() : [];
if (!this.toolHook && mcpTools.length === 0) {
this.logger.warn({evt}, 'TaskLlmVoiceAgent_S2S:_onServerEvent - no toolHook defined!');
} else {
const {functions} = evt;
const handledFunctions = [];
try {
if (mcpTools && mcpTools.length > 0) {
for (const func of functions) {
const {name, arguments: args, id} = func;
const tool = mcpTools.find((tool) => tool.name === name);
if (tool) {
handledFunctions.push(name);
const response = await this.parent.mcpService.callMcpTool(name, JSON.parse(args));
this.logger.debug({response}, 'TaskLlmVoiceAgent_S2S:_onServerEvent - function_call - mcp result');
this.processToolOutput(_ep, id, {
data: {
type: 'FunctionCallResponse',
id,
name,
content: response.content?.length > 0 ? response.content[0].text : 'There is no output from the function call'
}
});
}
}
}
for (const func of functions) {
const {name, arguments: args, id} = func;
if (!handledFunctions.includes(name)) {
await this.parent.sendToolHook(id, {name, args: JSON.parse(args)});
}
}
} catch (err) {
this.logger.info({err, evt}, 'TaskLlmVoiceAgent_S2S:_onServerEvent - error calling function');
this.results = {
completionReason: 'client error calling function',
error: err
};
endConversation = true;
}
}
}
/* check whether we should notify on this event */
if (this.includeEvents.length > 0 ? this.includeEvents.includes(type) : !this.excludeEvents.includes(type)) {
this.parent.sendEventHook(evt)
.catch((err) => this.logger.info({err}, 'TaskLlmVoiceAgent_S2S:_onServerEvent - error sending event hook'));
}
if (endConversation) {
this.logger.info({results: this.results},
'TaskLlmVoiceAgent_S2S:_onServerEvent - ending conversation due to error');
this.notifyTaskDone();
}
}
_populateEvents(events) {
if (events.includes('all')) {
/* work by excluding specific events */
const exclude = events
.filter((evt) => evt.startsWith('-'))
.map((evt) => evt.slice(1));
if (exclude.length === 0) this.includeEvents = va_server_events;
else this.excludeEvents = expandWildcards(exclude);
}
else {
/* work by including specific events */
const include = events
.filter((evt) => !evt.startsWith('-'));
this.includeEvents = expandWildcards(include);
}
this.logger.debug({
includeEvents: this.includeEvents,
excludeEvents: this.excludeEvents
}, 'TaskLlmVoiceAgent_S2S:_populateEvents');
}
}
module.exports = TaskLlmVoiceAgent_S2S;

View File

@@ -62,6 +62,9 @@ function makeTask(logger, obj, parent) {
case TaskName.Message:
const TaskMessage = require('./message');
return new TaskMessage(logger, data, parent);
case TaskName.Llm:
const TaskLlm = require('./llm');
return new TaskLlm(logger, data, parent);
case TaskName.Rasa:
const TaskRasa = require('./rasa');
return new TaskRasa(logger, data, parent);
@@ -81,6 +84,7 @@ function makeTask(logger, obj, parent) {
const TaskTranscribe = require('./transcribe');
return new TaskTranscribe(logger, data, parent);
case TaskName.Listen:
case TaskName.Stream:
const TaskListen = require('./listen');
return new TaskListen(logger, data, parent);
case TaskName.Redirect:
@@ -92,6 +96,9 @@ function makeTask(logger, obj, parent) {
case TaskName.Tag:
const TaskTag = require('./tag');
return new TaskTag(logger, data, parent);
case TaskName.Alert:
const TaskAlert = require('./alert');
return new TaskAlert(logger, data, parent);
}
// should never reach

View File

@@ -1,7 +1,7 @@
const Task = require('./task');
const {TaskName, TaskPreconditions} = require('../utils/constants');
const bent = require('bent');
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const {K8S} = require('../config');
class TaskMessage extends Task {
constructor(logger, opts) {
@@ -9,7 +9,7 @@ class TaskMessage extends Task {
this.preconditions = TaskPreconditions.None;
this.payload = {
message_sid: this.data.message_sid || uuidv4(),
message_sid: this.data.message_sid || crypto.randomUUID(),
carrier: this.data.carrier,
to: this.data.to,
from: this.data.from,

View File

@@ -1,12 +1,26 @@
const Task = require('./task');
const {TaskName, TaskPreconditions} = require('../utils/constants');
const { PlayFileNotFoundError } = require('../utils/error');
class TaskPlay extends Task {
constructor(logger, opts) {
super(logger, opts);
this.preconditions = TaskPreconditions.Endpoint;
this.url = this.data.url;
//Cleanup URLs that contain a querystring with a . unless that querystring is the filename
// see https://github.com/jambonz/jambonz-feature-server/pull/1293
// and https://github.com/jambonz/jambonz-feature-server/issues/1394 for background
if (this.data.url.includes('?')) {
if (['.mp3', '.wav'].includes(this.data.url.slice(-4))) {
this.url = this.data.url;
}
else {
this.url = this.data.url.split('?')[0] + '?' + this.data.url.split('?')[1].replaceAll('.', '%2E');
}
}
else {
this.url = this.data.url;
}
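// Illustrative example (hypothetical URLs, not from the source):
//   'https://example.com/audio?file=greeting.v2&x=1'
//     -> 'https://example.com/audio?file=greeting%2Ev2&x=1'
// whereas 'https://example.com/audio?name=prompt.mp3' is left untouched because
// the querystring ends in a recognized audio extension.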
this.seekOffset = this.data.seekOffset || -1;
this.timeoutSecs = this.data.timeoutSecs || -1;
this.loop = this.data.loop || 1;
@@ -26,6 +40,7 @@ class TaskPlay extends Task {
let playbackSeconds = 0;
let playbackMilliseconds = 0;
let completed = !(this.timeoutSecs > 0 || this.loop);
cs.playingAudio = true;
if (this.timeoutSecs > 0) {
timeout = setTimeout(async() => {
completed = true;
@@ -39,6 +54,22 @@ class TaskPlay extends Task {
try {
this.notifyStatus({event: 'start-playback'});
while (!this.killed && (this.loop === 'forever' || this.loop--) && this.ep.connected) {
/* Listen for playback-start event and set up a one-time listener for uuid_break
* that will kill the audio playback if the taskIds match. This ensures that
* we only kill the currently playing audio and not audio from other tasks.
* As we are using stickyEventEmitter, even if the event is emitted before the listener is registered,
* the listener will receive the most recent event.
*/
ep.once('playback-start', (evt) => {
this.logger.debug({evt}, 'Play got playback-start');
this.cs.stickyEventEmitter?.once('uuid_break', (t) => {
if (t?.taskId === this.taskId) {
this.logger.debug(`Play got kill-playback, executing uuid_break, taskId: ${t?.taskId}`);
this.ep.api('uuid_break', this.ep.uuid).catch((err) => this.logger.info(err, 'Error killing audio'));
this.notifyStatus({event: 'kill-playback'});
}
});
});
if (cs.isInConference) {
const {memberId, confName, confUuid} = cs;
if (Array.isArray(this.url)) {
@@ -66,23 +97,35 @@ class TaskPlay extends Task {
}
}
} catch (err) {
if (timeout) clearTimeout(timeout);
this.logger.info(err, `TaskPlay:exec - error playing ${this.url}`);
this.logger.info(`TaskPlay:exec - error playing ${this.url}: ${err.message}`);
this.playComplete = true;
if (err.message === 'File Not Found') {
const {writeAlerts, AlertType} = cs.srf.locals;
await this.performAction({status: 'fail', reason: 'playFailed'}, !(this.parentTask || cs.isConfirmCallSession));
this.emit('playDone');
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.PLAY_FILENOTFOUND,
url: this.url,
target_sid: cs.callSid
});
throw new PlayFileNotFoundError(this.url);
}
}
this.emit('playDone');
}
async kill(cs) {
super.kill(cs);
if (this.ep.connected && !this.playComplete) {
if (this.ep?.connected && !this.playComplete) {
this.logger.debug('TaskPlay:kill - killing audio');
if (cs.isInConference) {
const {memberId, confName} = cs;
this.killPlayToConfMember(this.ep, memberId, confName);
}
else {
this.notifyStatus({event: 'kill-playback'});
this.ep.api('uuid_break', this.ep.uuid).catch((err) => this.logger.info(err, 'Error killing audio'));
//this.ep.api('uuid_break', this.ep.uuid).catch((err) => this.logger.info(err, 'Error killing audio'));
cs.stickyEventEmitter.emit('uuid_break', this);
}
}
}

View File

@@ -1,5 +1,8 @@
const Task = require('./task');
const {TaskName} = require('../utils/constants');
const WsRequestor = require('../utils/ws-requestor');
const URL = require('url');
const HttpRequestor = require('../utils/http-requestor');
/**
* Redirects to a new application
@@ -13,6 +16,37 @@ class TaskRedirect extends Task {
async exec(cs) {
await super.exec(cs);
const isAbsoluteUrl = cs.application?.requestor?._isAbsoluteUrl(this.actionHook);
if (isAbsoluteUrl) {
this.logger.info(`TaskRedirect redirecting to new absolute URL ${this.actionHook}, requires new requestor`);
if (cs.requestor instanceof WsRequestor) {
try {
const requestor = new WsRequestor(this.logger, cs.accountSid, {url: this.actionHook},
cs.accountInfo.account.webhook_secret) ;
cs.requestor.emit('handover', requestor);
} catch (err) {
this.logger.info(err, `TaskRedirect error redirecting to ${this.actionHook}`);
}
}
else {
const baseUrl = this.cs.application.requestor.baseUrl;
const newUrl = URL.parse(this.actionHook);
const newBaseUrl = newUrl.protocol + '//' + newUrl.host;
if (baseUrl != newBaseUrl) {
try {
this.logger.info(`Task:redirect updating base url to ${newBaseUrl}`);
const newRequestor = new HttpRequestor(this.logger, cs.accountSid, {url: this.actionHook},
cs.accountInfo.account.webhook_secret);
cs.requestor.emit('handover', newRequestor);
} catch (err) {
this.logger.info(err, `TaskRedirect error updating base url to ${this.actionHook}`);
}
}
}
}
await this.performAction();
}
}

View File

@@ -12,12 +12,14 @@ class TaskRestDial extends Task {
this.from = this.data.from;
this.callerName = this.data.callerName;
this.timeLimit = this.data.timeLimit;
this.fromHost = this.data.fromHost;
this.to = this.data.to;
this.call_hook = this.data.call_hook;
this.timeout = this.data.timeout || 60;
this.sipRequestWithinDialogHook = this.data.sipRequestWithinDialogHook;
this.referHook = this.data.referHook;
this.recentCallStatus = 0;
this.on('connect', this._onConnect.bind(this));
this.on('callStatus', this._onCallStatus.bind(this));
@@ -39,9 +41,9 @@ class TaskRestDial extends Task {
if (this.data.amd) {
this.startAmd = cs.startAmd;
this.stopAmd = cs.stopAmd;
this.on('amd', this._onAmdEvent.bind(this, cs));
}
this.stopAmd = cs.stopAmd;
this._setCallTimer();
await this.awaitTaskDone();
@@ -56,7 +58,11 @@ class TaskRestDial extends Task {
this._clearCallTimer();
if (this.canCancel) {
this.canCancel = false;
cs?.req?.cancel();
try {
cs?.req?.cancel();
} catch (err) {
this.logger.error({err}, 'TaskRestDial: error cancelling call');
}
}
this.notifyTaskDone();
}
@@ -66,6 +72,9 @@ class TaskRestDial extends Task {
const cs = this.callSession;
cs.setDialog(dlg);
cs.referHook = this.referHook;
if (this.timeLimit) {
cs.startMaxCallDurationTimer(this.timeLimit);
}
this.logger.debug('TaskRestDial:_onConnect - call connected');
if (this.sipRequestWithinDialogHook) this._initSipRequestWithinDialogHandler(cs, dlg);
try {
@@ -73,15 +82,18 @@ class TaskRestDial extends Task {
const httpHeaders = b3 && {b3};
const params = {
...(cs.callInfo.toJSON()),
...(this.env_vars && {env_vars: this.env_vars}),
defaults: {
synthesizer: {
vendor: cs.speechSynthesisVendor,
language: cs.speechSynthesisLanguage,
voice: cs.speechSynthesisVoice
voice: cs.speechSynthesisVoice,
label: cs.speechSynthesisLabel,
},
recognizer: {
vendor: cs.speechRecognizerVendor,
language: cs.speechRecognizerLanguage
language: cs.speechRecognizerLanguage,
label: cs.speechRecognizerLabel,
}
}
};
@@ -111,7 +123,8 @@ class TaskRestDial extends Task {
}
_onCallStatus(status) {
this.logger.debug(`CallStatus: ${status}`);
this.logger.debug(`RestDial CallStatus: ${status}`);
this.recentCallStatus = status;
if (status >= 200) {
this.canCancel = false;
this._clearCallTimer();
@@ -129,11 +142,16 @@ class TaskRestDial extends Task {
}
_onCallTimeout() {
this.logger.debug('TaskRestDial: timeout expired without answer, killing task');
this.logger.debug(`TaskRestDial: timeout expired without answer, last status ${this.recentCallStatus}`);
this.timer = null;
if (this.canCancel) {
if (this.canCancel && this.recentCallStatus < 200) {
this.logger.debug('TaskRestDial: cancelling call attempt');
this.canCancel = false;
this.cs?.req?.cancel();
try {
this.cs?.req?.cancel();
} catch (err) {
this.logger.error({err}, 'TaskRestDial: error cancelling call');
}
}
}

View File

@@ -1,24 +1,57 @@
const assert = require('assert');
const TtsTask = require('./tts-task');
const {TaskName, TaskPreconditions} = require('../utils/constants');
const pollySSMLSplit = require('polly-ssml-split');
const { SpeechCredentialError, NonFatalTaskError } = require('../utils/error');
const { sleepFor } = require('../utils/helpers');
const { NON_FANTAL_ERRORS } = require('../utils/constants.json');
const breakLengthyTextIfNeeded = (logger, text) => {
const chunkSize = 1000;
/**
* Discard non-matching responses:
* (1) I sent a playback id but get a response with a different playback id
* (2) I sent a playback id but get a response with no playback id
* (3) I did not send a playback id but get a response with a playback id
* (4) I sent a cache file but get a response with a different cache file
*/
const isMatchingEvent = (logger, filename, playbackId, evt) => {
if (!!playbackId && !!evt.variable_tts_playback_id && evt.variable_tts_playback_id === playbackId) {
//logger.debug({filename, playbackId, evt}, 'Say:isMatchingEvent - playbackId matched');
return true;
}
if (!!filename && !!evt.file && evt.file === filename) {
//logger.debug({filename, playbackId, evt}, 'Say:isMatchingEvent - filename matched');
return true;
}
logger.info({filename, playbackId, evt}, 'Say:isMatchingEvent - no match');
return false;
};
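/*
 * Illustrative only (hypothetical values, not from the source):
 *   isMatchingEvent(logger, '/tmp/tts-abc.r8', 'pb-1', {variable_tts_playback_id: 'pb-1'}) // => true
 *   isMatchingEvent(logger, '/tmp/tts-abc.r8', null, {file: '/tmp/tts-abc.r8'})            // => true
 *   isMatchingEvent(logger, '/tmp/tts-abc.r8', 'pb-1', {variable_tts_playback_id: 'pb-2'}) // => false, logged
 */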
const breakLengthyTextIfNeeded = (logger, text) => {
// As the text can be used for tts streaming, we need to break lengthy text into smaller chunks
// (the chunk size is kept under HIGH_WATER_BUFFER_SIZE, defined in tts-streaming-buffer.js)
const chunkSize = 900;
const isSSML = text.startsWith('<speak>');
if (text.length <= chunkSize || !isSSML) return [text];
const options = {
// MIN length
softLimit: 100,
// MAX length, exclude 15 characters <speak></speak>
hardLimit: chunkSize - 15,
// Set of extra split characters (Optional property)
extraSplitChars: ',;!?',
};
pollySSMLSplit.configure(options);
try {
return pollySSMLSplit.split(text);
if (text.length <= chunkSize) return [text];
if (isSSML) {
return pollySSMLSplit.split(text);
} else {
// Wrap with <speak> and split
const wrapped = `<speak>${text}</speak>`;
const splitArr = pollySSMLSplit.split(wrapped);
// Remove <speak> and </speak> from each chunk
return splitArr.map((str) => str.replace(/^<speak>/, '').replace(/<\/speak>$/, ''));
}
} catch (err) {
logger.info({err}, 'Error spliting SSML long text');
logger.info({err}, 'Error splitting SSML long text');
return [text];
}
};
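/*
 * Illustrative only (hypothetical text, not from the source): a 2500-character plain-text
 * prompt is wrapped in <speak>...</speak>, split into chunks under the 900-character limit
 * (preferring sentence and clause boundaries), then unwrapped again:
 *   breakLengthyTextIfNeeded(logger, longText)
 *   // => ['first ~900 chars ...', 'next ~900 chars ...', 'remainder']
 * Text of 900 characters or fewer is returned unchanged as a single-element array.
 */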
@@ -34,24 +67,39 @@ class TaskSay extends TtsTask {
super(logger, opts, parentTask);
this.preconditions = TaskPreconditions.Endpoint;
this.text = (Array.isArray(this.data.text) ? this.data.text : [this.data.text])
.map((t) => breakLengthyTextIfNeeded(this.logger, t))
.flat();
assert.ok((typeof this.data.text === 'string' || Array.isArray(this.data.text)) || this.data.stream === true,
'Say: either text or stream:true is required');
this.loop = this.data.loop || 1;
this.isHandledByPrimaryProvider = true;
this.text = this.data.text ? (Array.isArray(this.data.text) ? this.data.text : [this.data.text])
.map((t) => breakLengthyTextIfNeeded(this.logger, t))
.flat() : [];
if (this.data.stream === true) {
this._isStreamingTts = true;
this.closeOnStreamEmpty = this.data.closeOnStreamEmpty !== false;
}
else {
this._isStreamingTts = false;
this.loop = this.data.loop || 1;
this.isHandledByPrimaryProvider = true;
}
}
get name() { return TaskName.Say; }
get summary() {
for (let i = 0; i < this.text.length; i++) {
if (this.text[i].startsWith('silence_stream')) continue;
return `${this.name}{text=${this.text[i].slice(0, 15)}${this.text[i].length > 15 ? '...' : ''}}`;
if (this.isStreamingTts) return `${this.name} streaming`;
else {
for (let i = 0; i < this.text.length; i++) {
if (this.text[i].startsWith('silence_stream')) continue;
return `${this.name}{text=${this.text[i].slice(0, 15)}${this.text[i].length > 15 ? '...' : ''}}`;
}
return `${this.name}{${this.text[0]}}`;
}
return `${this.name}{${this.text[0]}}`;
}
get isStreamingTts() { return this._isStreamingTts; }
_validateURL(urlString) {
try {
new URL(urlString);
@@ -61,147 +109,105 @@ class TaskSay extends TtsTask {
}
}
async _synthesizeWithSpecificVendor(cs, ep, {vendor, language, voice, label, preCache = false}) {
const {srf, accountSid:account_sid} = cs;
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, srf);
const {writeAlerts, AlertType, stats} = srf.locals;
const {synthAudio} = srf.locals.dbHelpers;
const engine = this.synthesizer.engine || cs.synthesizer?.engine || 'neural';
const salt = cs.callSid;
let credentials = cs.getSpeechCredentials(vendor, 'tts', label);
/* parse Nuance voices into name and model */
let model;
if (vendor === 'nuance' && voice) {
const arr = /([A-Za-z-]*)\s+-\s+(enhanced|standard)/.exec(voice);
if (arr) {
voice = arr[1];
model = arr[2];
}
} else if (vendor === 'deepgram') {
model = voice;
async exec(cs, obj) {
if (this.isStreamingTts && !cs.appIsUsingWebsockets) {
throw new Error('Say: streaming say verb requires applications to use the websocket API');
}
/* allow for microsoft custom region voice and api_key to be specified as an override */
if (vendor === 'microsoft' && this.options.deploymentId) {
credentials = credentials || {};
credentials.use_custom_tts = true;
credentials.custom_tts_endpoint = this.options.deploymentId;
credentials.api_key = this.options.apiKey || credentials.apiKey;
credentials.region = this.options.region || credentials.region;
voice = this.options.voice || voice;
} else if (vendor === 'elevenlabs') {
credentials = credentials || {};
credentials.model_id = this.options.model_id || credentials.model_id;
credentials.voice_settings = this.options.voice_settings || {};
credentials.optimize_streaming_latency = this.options.optimize_streaming_latency
|| credentials.optimize_streaming_latency;
voice = this.options.voice_id || voice;
}
ep.set({
tts_engine: vendor.startsWith('custom:') ? 'custom' : vendor,
tts_voice: voice,
cache_speech_handles: !cs.currentTtsVendor || cs.currentTtsVendor === vendor ? 1 : 0,
}).catch((err) => this.logger.info({err}, 'Error setting tts_engine on endpoint'));
// set the current vendor on the call session
// If vendor is changed from the previous one, then reset the cache_speech_handles flag
cs.currentTtsVendor = vendor;
if (!preCache && !this._disableTracing) this.logger.info({vendor, language, voice, model}, 'TaskSay:exec');
try {
if (!credentials) {
writeAlerts({
account_sid,
alert_type: AlertType.TTS_NOT_PROVISIONED,
vendor
}).catch((err) => this.logger.info({err}, 'Error generating alert for no tts'));
throw new Error('no provisioned speech credentials for TTS');
this._isStreamingTts = this._isStreamingTts || cs.autoStreamTts;
if (this.isStreamingTts) {
this.closeOnStreamEmpty = this.closeOnStreamEmpty || this.text.length !== 0;
}
// synthesize all of the text elements
let lastUpdated = false;
/* produce an audio segment from the provided text */
const generateAudio = async(text) => {
if (this.killed) return;
if (text.startsWith('silence_stream://')) return text;
/* otel: trace time for tts */
if (!preCache && !this._disableTracing) {
const {span} = this.startChildSpan('tts-generation', {
'tts.vendor': vendor,
'tts.language': language,
'tts.voice': voice
});
this.otelSpan = span;
}
try {
const {filePath, servedFromCache, rtt} = await synthAudio(stats, {
account_sid,
text,
vendor,
language,
voice,
engine,
model,
salt,
credentials,
options: this.options,
disableTtsCache : this.disableTtsCache,
preCache
});
if (!filePath.startsWith('say:')) {
this.logger.debug(`Say: file ${filePath}, served from cache ${servedFromCache}`);
if (filePath) cs.trackTmpFile(filePath);
if (this.otelSpan) {
this.otelSpan.setAttributes({'tts.cached': servedFromCache});
this.otelSpan.end();
this.otelSpan = null;
}
if (!servedFromCache && !lastUpdated) {
lastUpdated = true;
updateSpeechCredentialLastUsed(credentials.speech_credential_sid).catch(() => {/* logged error */});
}
if (!servedFromCache && rtt && !preCache && !this._disableTracing) {
this.notifyStatus({
event: 'synthesized-audio',
vendor,
language,
characters: text.length,
elapsedTime: rtt
});
}
}
else {
this.logger.debug('Say: a streaming tts api will be used');
const modifiedPath = filePath.replace('say:{', `say:{session-uuid=${ep.uuid},`);
return modifiedPath;
}
return filePath;
} catch (err) {
this.logger.info({err}, 'Error synthesizing tts');
if (this.otelSpan) this.otelSpan.end();
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.TTS_FAILURE,
vendor,
detail: err.message
}).catch((err) => this.logger.info({err}, 'Error generating alert for tts failure'));
throw err;
}
};
const arr = this.text.map((t) => (this._validateURL(t) ? t : generateAudio(t)));
return (await Promise.all(arr)).filter((fp) => fp && fp.length);
} catch (err) {
this.logger.info(err, 'TaskSay:exec error');
throw err;
if (this.isStreamingTts) await this.handlingStreaming(cs, obj);
else await this.handling(cs, obj);
} catch (error) {
if (error instanceof SpeechCredentialError) {
// if say failed due to speech credentials, an alarm has been written and an error notification sent;
// finish this say and move on to the next task.
this.logger.info({error}, 'Say failed due to SpeechCredentialError, finished!');
return;
}
throw error;
}
}
async exec(cs, {ep}) {
const {srf, accountSid:account_sid} = cs;
async handlingStreaming(cs, {ep}) {
const {vendor, language, voice, label} = this.getTtsVendorData(cs);
const credentials = cs.getSpeechCredentials(vendor, 'tts', label);
if (!credentials) {
throw new SpeechCredentialError(
`No text-to-speech service credentials for ${vendor} with labels: ${label} have been configured`);
}
this.ep = ep;
try {
await this.setTtsStreamingChannelVars(vendor, language, voice, credentials, ep);
await cs.startTtsStream();
if (this.text.length !== 0) {
this.logger.info('TaskSay:handlingStreaming - sending text to TTS stream');
for (const t of this.text) {
const result = await cs._internalTtsStreamingBufferTokens(t);
if (result?.status === 'failed') {
if (result.reason === 'full') {
// Retry logic for full buffer
const maxRetries = 5;
let backoffMs = 1000;
for (let retryCount = 0; retryCount < maxRetries && !this.killed; retryCount++) {
this.logger.info(
`TaskSay:handlingStreaming - retry ${retryCount + 1}/${maxRetries} after ${backoffMs}ms`);
await sleepFor(backoffMs);
const retryResult = await cs._internalTtsStreamingBufferTokens(t);
// Exit retry loop on success
if (retryResult?.status !== 'failed') {
break;
}
// Handle failure for reason other than full buffer
if (retryResult.reason !== 'full') {
this.logger.info(
{result: retryResult}, 'TaskSay:handlingStreaming - TTS stream failed to buffer tokens');
throw new Error(`TTS stream failed to buffer tokens: ${retryResult.reason}`);
}
// Last retry attempt failed
if (retryCount === maxRetries - 1) {
this.logger.info('TaskSay:handlingStreaming - Maximum retries exceeded for full buffer');
throw new Error('TTS stream buffer full - maximum retries exceeded');
}
// Increase backoff for next retry
backoffMs = Math.min(backoffMs * 1.5, 10000);
}
} else {
// Immediate failure for non-full buffer issues
this.logger.info({result}, 'TaskSay:handlingStreaming - TTS stream failed to buffer tokens');
throw new Error(`TTS stream failed to buffer tokens: ${result.reason}`);
}
} else {
await cs._lccTtsFlush();
}
}
}
} catch (err) {
this.logger.info({err}, 'TaskSay:handlingStreaming - Error setting channel vars');
cs.requestor?.request('tts:streaming-event', '/streaming-event', {event_type: 'stream_closed'})
.catch((err) => this.logger.info({err}, 'TaskSay:handlingStreaming - Error sending stream_closed event'));
//TODO: send tts:streaming-event with error?
this.notifyTaskDone();
}
await this.awaitTaskDone();
this.logger.info('TaskSay:handlingStreaming - done');
}
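The buffer-retry loop above can also be read as a small generic helper; this standalone sketch mirrors the same policy (5 attempts, 1s initial delay, 1.5x growth capped at 10s, immediate failure for any reason other than a full buffer) and is illustrative only:
// sketch of the retry policy used when the streaming buffer reports 'full'
const sleepFor = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

async function bufferWithRetry(sendTokens, token, {maxRetries = 5, initialBackoffMs = 1000} = {}) {
  let result = await sendTokens(token);
  let backoffMs = initialBackoffMs;
  for (let attempt = 0;
    result?.status === 'failed' && result.reason === 'full' && attempt < maxRetries;
    attempt++) {
    // back off, then retry the same token
    await sleepFor(backoffMs);
    backoffMs = Math.min(backoffMs * 1.5, 10000);
    result = await sendTokens(token);
  }
  if (result?.status === 'failed') {
    throw new Error(`TTS stream failed to buffer tokens: ${result.reason}`);
  }
  return result;
}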
async handling(cs, {ep}) {
const {srf, accountSid:account_sid, callSid:target_sid} = cs;
const {writeAlerts, AlertType} = srf.locals;
const {addFileToCache} = srf.locals.dbHelpers;
const engine = this.synthesizer.engine || cs.synthesizer?.engine || 'neural';
@@ -218,10 +224,7 @@ class TaskSay extends TtsTask {
let voice = this.synthesizer.voice && this.synthesizer.voice !== 'default' ?
this.synthesizer.voice :
cs.speechSynthesisVoice;
// label can be null/empty in synthesizer config, just use application level label if it's default
let label = this.synthesizer.label === 'default' ?
cs.speechSynthesisLabel :
this.synthesizer.label;
let label = this.taskIncludeSynthesizer ? this.synthesizer.label : cs.speechSynthesisLabel;
const fallbackVendor = this.synthesizer.fallbackVendor && this.synthesizer.fallbackVendor !== 'default' ?
this.synthesizer.fallbackVendor :
@@ -232,10 +235,8 @@ class TaskSay extends TtsTask {
const fallbackVoice = this.synthesizer.fallbackVoice && this.synthesizer.fallbackVoice !== 'default' ?
this.synthesizer.fallbackVoice :
cs.fallbackSpeechSynthesisVoice;
// label can be null/empty in synthesizer config, just use application level label if it's default
const fallbackLabel = this.synthesizer.fallbackLabel === 'default' ?
cs.fallbackSpeechSynthesisLabel :
this.synthesizer.fallbackLabel;
const fallbackLabel = this.taskIncludeSynthesizer ?
this.synthesizer.fallbackLabel : cs.fallbackSpeechSynthesisLabel;
if (cs.hasFallbackTts) {
vendor = fallbackVendor;
@@ -261,9 +262,10 @@ class TaskSay extends TtsTask {
} else {
this.notifyError(
{ msg: 'TTS error', details:`TTS vendor ${vendor} error: ${error}`, failover: 'not available'});
throw error;
throw new SpeechCredentialError(error.message);
}
};
let filepath;
try {
filepath = await this._synthesizeWithSpecificVendor(cs, ep, {vendor, language, voice, label});
@@ -275,69 +277,145 @@ class TaskSay extends TtsTask {
while (!this.killed && (this.loop === 'forever' || this.loop--) && ep?.connected) {
let segment = 0;
while (!this.killed && segment < filepath.length) {
const filename = filepath[segment];
if (cs.isInConference) {
const {memberId, confName, confUuid} = cs;
await this.playToConfMember(ep, memberId, confName, confUuid, filepath[segment]);
await this.playToConfMember(ep, memberId, confName, confUuid, filename);
}
else {
let tts_cache_filename;
if (filepath[segment].startsWith('say:{')) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filepath[segment]);
if (arr) this.logger.debug(`Say:exec sending streaming tts request: ${arr[1].substring(0, 64)}..`);
const isStreaming = filename.startsWith('say:{');
if (isStreaming) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filename);
if (arr) this.logger.debug(`Say:exec sending streaming tts request ${arr[1].substring(0, 64)}..`);
else this.logger.debug(`Say:exec sending ${filename.substring(0, 64)}`);
}
else this.logger.debug(`Say:exec sending ${filepath[segment].substring(0, 64)}`);
ep.once('playback-start', (evt) => {
this.logger.debug({evt}, 'Say got playback-start');
if (this.otelSpan) {
this._addStreamingTtsAttributes(this.otelSpan, evt);
this.otelSpan.end();
this.otelSpan = null;
if (evt.variable_tts_cache_filename) {
tts_cache_filename = evt.variable_tts_cache_filename;
cs.trackTmpFile(evt.variable_tts_cache_filename);
const onPlaybackStop = (evt) => {
try {
const playbackId = this.getPlaybackId(segment);
const isMatch = isMatchingEvent(this.logger, filename, playbackId, evt);
if (!isMatch) {
this.logger.info({currentPlaybackId: playbackId, stopPlaybackId: evt.variable_tts_playback_id},
'Say:exec discarding playback-stop for earlier play');
ep.once('playback-stop', this._boundOnPlaybackStop);
return;
}
else {
this.logger.info('No tts_cache_filename in playback-start event');
this.logger.debug({evt},
`Say got playback-stop ${evt.variable_tts_playback_id ? evt.variable_tts_playback_id : ''}`);
this.notifyStatus({event: 'stop-playback'});
this.notifiedPlayBackStop = true;
const tts_error = evt.variable_tts_error;
// some tts vendors may not provide a response code, so we default to 200
let response_code = 200;
// Check if any property ends with _response_code
for (const [key, value] of Object.entries(evt)) {
if (key.endsWith('_response_code')) {
response_code = parseInt(value, 10);
if (isNaN(response_code)) {
this.logger.info(`Say:exec playback-stop - Invalid response code: ${value}`);
response_code = 0;
}
break;
}
}
}
});
ep.once('playback-stop', (evt) => {
if (!tts_cache_filename || evt.variable_tts_cache_filename !== tts_cache_filename) {
this.logger.info({evt}, 'Say: discarding playback-stop from other say verb');
}
else {
this.logger.debug({evt}, 'Say got playback-stop');
if (evt.variable_tts_error) {
if (tts_error ||
// error response codes indicate failure
response_code <= 199 || response_code >= 300) {
writeAlerts({
account_sid,
alert_type: AlertType.TTS_FAILURE,
vendor,
detail: evt.variable_tts_error
detail: evt.variable_tts_error || `TTS playback failed with response code ${response_code}`,
target_sid
}).catch((err) => this.logger.info({err}, 'Error generating alert for tts failure'));
}
if (evt.variable_tts_cache_filename && !this.killed) {
if (
!tts_error &&
//2xx response codes indicate success
199 < response_code && response_code < 300 &&
evt.variable_tts_cache_filename &&
!this.killed &&
// if tts cache is not disabled, add the file to cache
!this.disableTtsCache
) {
const text = parseTextFromSayString(this.text[segment]);
this.logger.debug({text, cacheFile: evt.variable_tts_cache_filename}, 'Say:exec cache tts');
addFileToCache(evt.variable_tts_cache_filename, {
account_sid,
vendor,
language,
voice,
engine,
text
model: this.model || this.model_id,
text,
instructions: this.instructions
}).catch((err) => this.logger.info({err}, 'Error adding file to cache'));
}
if (this._playResolve) {
(tts_error ||
// error response codes indicate failure
response_code <= 199 || response_code >= 300
) ?
this._playReject(
new Error(evt.variable_tts_error || `TTS playback failed with response code ${response_code}`)
) : this._playResolve();
}
} catch (err) {
this.logger.info({err}, 'Error handling playback-stop event');
}
if (this._playResolve) {
evt.variable_tts_error ? this._playReject(new Error(evt.variable_tts_error)) : this._playResolve();
};
this._boundOnPlaybackStop = onPlaybackStop.bind(this);
const onPlaybackStart = (evt) => {
try {
const playbackId = this.getPlaybackId(segment);
const isMatch = isMatchingEvent(this.logger, filename, playbackId, evt);
if (!isMatch) {
this.logger.info({currentPlaybackId: playbackId, startPlaybackId: evt.variable_tts_playback_id},
'Say:exec playback-start - unmatched playback_id');
ep.once('playback-start', this._boundOnPlaybackStart);
return;
}
ep.once('playback-stop', this._boundOnPlaybackStop);
this.logger.debug({evt},
`Say got playback-start ${evt.variable_tts_playback_id ? evt.variable_tts_playback_id : ''}`);
if (this.otelSpan) {
this._addStreamingTtsAttributes(this.otelSpan, evt, vendor);
this.otelSpan.end();
this.otelSpan = null;
if (evt.variable_tts_cache_filename) {
cs.trackTmpFile(evt.variable_tts_cache_filename);
}
}
} catch (err) {
this.logger.info({err}, 'Error handling playback-start event');
}
});
};
this._boundOnPlaybackStart = onPlaybackStart.bind(this);
ep.once('playback-start', this._boundOnPlaybackStart);
// wait for the playback-stop event to confirm whether the playback succeeded
this._playPromise = new Promise((resolve, reject) => {
this._playResolve = resolve;
this._playReject = reject;
});
const r = await ep.play(filepath[segment]);
this.logger.debug({r}, 'Say:exec play result');
try {
const r = await ep.play(filename);
this.logger.debug({r}, 'Say:exec play result');
if (r.playbackSeconds == null && r.playbackMilliseconds == null && r.playbackLastOffsetPos == null) {
this._playReject(new Error('Playback failed to start'));
}
} catch (err) {
if (NON_FANTAL_ERRORS.includes(err.message)) {
throw new NonFatalTaskError(err.message);
}
throw err;
}
try {
// wait for the playback-stop event to confirm whether the playback succeeded
await this._playPromise;
@@ -347,24 +425,24 @@ class TaskSay extends TtsTask {
continue;
} catch (err) {
this.logger.info({err}, 'Error waiting for playback-stop event');
throw err;
}
} finally {
this._playPromise = null;
this._playResolve = null;
this._playReject = null;
}
if (filepath[segment].startsWith('say:{')) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filepath[segment]);
if (filename.startsWith('say:{')) {
const arr = /^say:\{.*\}\s*(.*)$/.exec(filename);
if (arr) this.logger.debug(`Say:exec complete playing streaming tts request: ${arr[1].substring(0, 64)}..`);
} else {
// This log will print speech credentials in say command for tts stream mode
this.logger.debug(`Say:exec completed play file ${filepath[segment]}`);
this.logger.debug(`Say:exec completed play file ${filename}`);
}
}
segment++;
}
}
this.emit('playDone');
}
async kill(cs) {
@@ -374,8 +452,13 @@ class TaskSay extends TtsTask {
if (cs.isInConference) {
const {memberId, confName} = cs;
this.killPlayToConfMember(this.ep, memberId, confName);
}
else {
} else if (this.isStreamingTts) {
this.logger.debug('TaskSay:kill - stopping TTS stream for streaming audio');
cs.stopTtsStream();
} else {
if (!this.notifiedPlayBackStop) {
this.notifyStatus({event: 'stop-playback'});
}
this.notifyStatus({event: 'kill-playback'});
this.ep.api('uuid_break', this.ep.uuid);
}
@@ -387,26 +470,41 @@ class TaskSay extends TtsTask {
this._playResolve = null;
}
}
this.notifyTaskDone();
}
_addStreamingTtsAttributes(span, evt) {
_addStreamingTtsAttributes(span, evt, vendor) {
const attrs = {'tts.cached': false};
for (const [key, value] of Object.entries(evt)) {
if (key.startsWith('variable_tts_')) {
let newKey = key.substring('variable_tts_'.length)
.replace('whisper_', 'whisper.')
.replace('nvidia_', 'nvidia.')
.replace('deepgram_', 'deepgram.')
.replace('playht_', 'playht.')
.replace('cartesia_', 'cartesia.')
.replace('rimelabs_', 'rimelabs.')
.replace('resemble_', 'resemble.')
.replace('inworld_', 'inworld.')
.replace('verbio_', 'verbio.')
.replace('elevenlabs_', 'elevenlabs.');
if (spanMapping[newKey]) newKey = spanMapping[newKey];
attrs[newKey] = value;
if (key === 'variable_tts_time_to_first_byte_ms' && value) {
this.cs.srf.locals.stats.histogram('tts.response_time', value, [`vendor:${vendor}`]);
}
}
}
delete attrs['cache_filename']; //no value in adding this to the span
span.setAttributes(attrs);
}
notifyTtsStreamIsEmpty() {
if (this.isStreamingTts && this.closeOnStreamEmpty) {
this.logger.info('TaskSay:notifyTtsStreamIsEmpty - stream is empty, killing task');
this.notifyTaskDone();
}
}
}
const spanMapping = {
@@ -443,10 +541,23 @@ const spanMapping = {
'playht.name_lookup_time_ms': 'name_lookup_ms',
'playht.connect_time_ms': 'connect_ms',
'playht.final_response_time_ms': 'final_response_ms',
// Cartesia
'cartesia.request_id': 'cartesia.req_id',
'cartesia.name_lookup_time_ms': 'name_lookup_ms',
'cartesia.connect_time_ms': 'connect_ms',
'cartesia.final_response_time_ms': 'final_response_ms',
// Rimelabs
'rimelabs.name_lookup_time_ms': 'name_lookup_ms',
'rimelabs.connect_time_ms': 'connect_ms',
'rimelabs.final_response_time_ms': 'final_response_ms',
// Resemble
'resemble.connect_time_ms': 'connect_ms',
'resemble.final_response_time_ms': 'final_response_ms',
// inworld
'inworld.name_lookup_time_ms': 'name_lookup_ms',
'inworld.connect_time_ms': 'connect_ms',
'inworld.final_response_time_ms': 'final_response_ms',
'inworld.x_envoy_upstream_service_time': 'upstream_service_time',
// verbio
'verbio.name_lookup_time_ms': 'name_lookup_ms',
'verbio.connect_time_ms': 'connect_ms',

View File

@@ -18,6 +18,11 @@ class TaskSipDecline extends Task {
super.exec(cs);
res.send(this.data.status, this.data.reason, {
headers: this.headers
}, (err) => {
if (!err) {
// Call was successfully declined
cs._callReleased();
}
});
cs.emit('callStatusChange', {
callStatus: CallStatus.Failed,

View File

@@ -12,6 +12,7 @@ class TaskSipRefer extends Task {
this.referTo = this.data.referTo;
this.referredBy = this.data.referredBy;
this.referredByDisplayName = this.data.referredByDisplayName;
this.headers = this.data.headers || {};
this.eventHook = this.data.eventHook;
}
@@ -94,7 +95,10 @@ class TaskSipRefer extends Task {
}
if (status >= 200) {
this.referSpan.setAttributes({'refer.finalNotify': status});
await this.performAction({refer_status: 202, final_referred_call_status: status});
await this.performAction({refer_status: 202, final_referred_call_status: status})
.catch((err) => {
this.logger.error(err, 'TaskSipRefer:exec - error performing action finalNotify');
});
this.notifyTaskDone();
}
}
@@ -102,12 +106,17 @@ class TaskSipRefer extends Task {
}
_normalizeReferHeaders(cs, dlg) {
let {referTo, referredBy} = this;
let {referTo, referredBy, referredByDisplayName} = this;
/* get IP address of the SBC to use as hostname if needed */
const {host} = parseUri(dlg.remote.uri);
if (!referTo.startsWith('<') && !referTo.startsWith('sip') && !referTo.startsWith('"')) {
if (
!referTo.startsWith('<') &&
!referTo.startsWith('sip:') &&
!referTo.startsWith('"') &&
!referTo.startsWith('tel:')
) {
/* they may have only provided a phone number/user */
referTo = `sip:${referTo}@${host}`;
}
@@ -117,9 +126,17 @@ class TaskSipRefer extends Task {
referredBy = cs.req?.callingNumber || dlg.local.uri;
this.logger.info({referredBy}, 'setting referredby');
}
if (!referredBy.startsWith('<') && !referredBy.startsWith('sip') && !referredBy.startsWith('"')) {
if (!referredByDisplayName) {
referredByDisplayName = cs.req?.callingName;
}
if (
!referredBy.startsWith('<') &&
!referredBy.startsWith('sip:') &&
!referredBy.startsWith('"') &&
!referredBy.startsWith('tel:')
) {
/* they may have only provided a phone number/user */
referredBy = `sip:${referredBy}@${host}`;
referredBy = `${referredByDisplayName ? `"${referredByDisplayName}"` : ''}<sip:${referredBy}@${host}>`;
}
return {referTo, referredBy};
}

View File

@@ -2,6 +2,33 @@ const Task = require('./task');
const assert = require('assert');
const crypto = require('crypto');
const { TaskPreconditions, CobaltTranscriptionEvents } = require('../utils/constants');
const { SpeechCredentialError } = require('../utils/error');
const {JAMBONES_AWS_TRANSCRIBE_USE_GRPC} = require('../config');
const {TaskName} = require('../utils/constants.json');
/**
 * processTurnString("Please insert turns here: {{turns:4}}")
 * // -> { processed: 'Please insert turns here: {{turns}}', turns: 4 }
 * processTurnString("Please insert turns here: {{turns}}")
 * // -> { processed: 'Please insert turns here: {{turns}}', turns: null }
 */
const processTurnString = (input) => {
const regex = /\{\{turns(?::(\d+))?\}\}/;
const match = input.match(regex);
if (!match) {
return {
processed: input,
turns: null
};
}
const turns = match[1] ? parseInt(match[1], 10) : null;
const processed = input.replace(regex, '{{turns}}');
return { processed, turns };
};
class SttTask extends Task {
@@ -16,14 +43,22 @@ class SttTask extends Task {
normalizeTranscription,
setSpeechCredentialsAtRuntime,
compileSonioxTranscripts,
consolidateTranscripts
consolidateTranscripts,
updateSpeechmaticsPayload
} = require('../utils/transcription-utils')(logger);
this.setChannelVarsForStt = setChannelVarsForStt;
this.normalizeTranscription = normalizeTranscription;
this.compileSonioxTranscripts = compileSonioxTranscripts;
this.consolidateTranscripts = consolidateTranscripts;
this.updateSpeechmaticsPayload = updateSpeechmaticsPayload;
this.eventHandlers = [];
this.isHandledByPrimaryProvider = true;
/**
 * The task uses taskIncludeRecognizer to decide which label applies:
 * if taskIncludeRecognizer === true, use the label from verb.recognizer, even when it is empty
 * if taskIncludeRecognizer === false, use the label from application.recognizer
 */
this.taskIncludeRecognizer = !!this.data.recognizer;
if (this.data.recognizer) {
const recognizer = this.data.recognizer;
this.vendor = recognizer.vendor;
@@ -33,7 +68,6 @@ class SttTask extends Task {
//fallback
this.fallbackVendor = recognizer.fallbackVendor || 'default';
this.fallbackLanguage = recognizer.fallbackLanguage || 'default';
// label can be empty and should not have default value.
this.fallbackLabel = recognizer.fallbackLabel;
/* let credentials be supplied in the recognizer object at runtime */
@@ -51,6 +85,9 @@ class SttTask extends Task {
/*bug name prefix */
this.bugname_prefix = '';
// stt latency calculator
this.stt_latency_ms = '';
}
async exec(cs, {ep, ep2}) {
@@ -58,6 +95,12 @@ class SttTask extends Task {
this.ep = ep;
this.ep2 = ep2;
// start vad from stt latency calculator
if (this.name !== TaskName.Gather ||
this.name === TaskName.Gather && this.needsStt) {
cs.startSttLatencyVad();
}
// use session preferences if we don't have specific verb-level settings.
if (cs.recognizer) {
for (const k in cs.recognizer) {
@@ -82,8 +125,7 @@ class SttTask extends Task {
this.language = cs.speechRecognizerLanguage;
if (this.data.recognizer) this.data.recognizer.language = this.language;
}
// label can be empty, should not assign application level label
if ('default' === this.label) {
if (!this.taskIncludeRecognizer) {
this.label = cs.speechRecognizerLabel;
if (this.data.recognizer) this.data.recognizer.label = this.label;
}
@@ -96,17 +138,21 @@ class SttTask extends Task {
this.fallbackLanguage = cs.fallbackSpeechRecognizerLanguage;
if (this.data.recognizer) this.data.recognizer.fallbackLanguage = this.fallbackLanguage;
}
// label can be empty, should not assign application level label
if ('default' === this.fallbackLabel) {
if (!this.taskIncludeRecognizer) {
this.fallbackLabel = cs.fallbackSpeechRecognizerLabel;
if (this.data.recognizer) this.data.recognizer.fallbackLabel = this.fallbackLabel;
}
// If the call has already fallen back to the 2nd ASR vendor,
// use that.
if (cs.hasFallbackAsr) {
this.vendor = this.fallbackVendor;
this.language = this.fallbackLanguage;
this.label = this.fallbackLabel;
if (this.taskIncludeRecognizer) {
// reset fallback ASR from previous run if this verb contains data.recognizer.
cs.hasFallbackAsr = false;
} else {
this.logger.debug('Call session has fallback to 2nd ASR, use 2nd recognizer configuration');
this.vendor = this.fallbackVendor;
this.language = this.fallbackLanguage;
this.label = this.fallbackLabel;
}
}
if (!this.data.recognizer.vendor) {
this.data.recognizer.vendor = this.vendor;
@@ -157,6 +203,56 @@ class SttTask extends Task {
if (cs.hasGlobalSttPunctuation && !this.data.recognizer.punctuation) {
this.data.recognizer.punctuation = cs.globalSttPunctuation;
}
if (this.vendor === 'gladia') {
const { api_key, region } = this.sttCredentials;
const {url} = await this.createGladiaLiveSession({
api_key, region,
model: this.data.recognizer.model || 'solaria-1',
options: this.data.recognizer.gladiaOptions || {}
});
const {host, pathname, search} = new URL(url);
this.sttCredentials.host = host;
this.sttCredentials.path = `${pathname}${search}`;
}
}
async createGladiaLiveSession({
api_key,
region = 'us-west',
model = 'solaria-1',
options = {},
}) {
const url = `https://api.gladia.io/v2/live?region=${region}`;
const response = await fetch(url, {
method: 'POST',
headers: {
'x-gladia-key': api_key,
'Content-Type': 'application/json'
},
body: JSON.stringify({
encoding: 'wav/pcm',
bit_depth: 16,
sample_rate: 8000,
channels: 1,
model,
...options,
messages_config: {
receive_final_transcripts: true,
receive_speech_events: true,
receive_errors: true,
}
})
});
if (!response.ok) {
const error = await response.text();
this.logger.error({url, status: response.status, error}, 'Error creating Gladia live session');
throw new Error(`Error creating Gladia live session: ${response.status} ${error}`);
}
const data = await response.json();
this.logger.debug({url: data.url}, 'Gladia Call registered');
return data;
}
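For context, a transcribe verb that selects this Gladia path might look roughly like the following; only vendor, model, gladiaOptions, and transcriptionHook appear in this diff, the rest of the payload shape is an assumption:
// hypothetical application JSON fragment selecting gladia
const transcribeVerb = {
  verb: 'transcribe',
  transcriptionHook: '/transcription',
  recognizer: {
    vendor: 'gladia',
    model: 'solaria-1',     // default used above when no model is supplied
    gladiaOptions: {}       // merged into the live-session request body
  }
};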
addCustomEventListener(ep, event, handler) {
@@ -178,10 +274,11 @@ class SttTask extends Task {
writeAlerts({
account_sid: cs.accountSid,
alert_type: AlertType.STT_NOT_PROVISIONED,
vendor
vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, 'Error generating alert for no stt'));
this.notifyTaskDone();
throw new Error(`No speech-to-text service credentials for ${vendor} have been configured`);
// the ASR might have a fallback configuration, so do not mark the task done here.
throw new SpeechCredentialError(`No speech-to-text service credentials for ${vendor} have been configured`);
}
if (vendor === 'nuance' && credentials.client_id) {
@@ -205,14 +302,30 @@ class SttTask extends Task {
region,
roleArn
});
this.logger.debug({roleArn}, `got aws access token ${servedFromCache ? 'from cache' : ''}`);
credentials = {...credentials, accessKeyId, secretAccessKey, sessionToken};
} else if (vendor === 'verbio' && credentials.client_id && credentials.client_secret) {
this.logger.debug({roleArn}, `(roleArn) got aws access token ${servedFromCache ? 'from cache' : ''}`);
// from the role ARN we get a SessionToken, but the feature server uses it as securityToken.
credentials = {...credentials, accessKeyId, secretAccessKey, securityToken: sessionToken};
}
else if (vendor === 'verbio' && credentials.client_id && credentials.client_secret) {
const {access_token, servedFromCache} = await getVerbioAccessToken(credentials);
this.logger.debug({client_id: credentials.client_id},
`got verbio access token ${servedFromCache ? 'from cache' : ''}`);
credentials.access_token = access_token;
}
else if (vendor == 'aws' && !JAMBONES_AWS_TRANSCRIBE_USE_GRPC) {
/* get AWS access token */
const {speech_credential_sid, accessKeyId, secretAccessKey, securityToken, region } = credentials;
if (!securityToken) {
const { servedFromCache, ...newCredentials} = await getAwsAuthToken({
speech_credential_sid,
accessKeyId,
secretAccessKey,
region});
this.logger.debug({newCredentials}, `got aws security token ${servedFromCache ? 'from cache' : ''}`);
credentials = {...newCredentials, region};
}
}
return credentials;
}
@@ -222,12 +335,12 @@ class SttTask extends Task {
async _initFallback() {
assert(this.fallbackVendor, 'fallback failed without fallbackVendor configuration');
this.logger.info(`Failed to use primary STT provider, fallback to ${this.fallbackVendor}`);
this.isHandledByPrimaryProvider = false;
this.cs.hasFallbackAsr = true;
this.logger.info(`Failed to use primary STT provider, fallback to ${this.fallbackVendor}`);
this.vendor = this.fallbackVendor;
this.language = this.fallbackLanguage;
this.label = this.fallbackLabel;
this.vendor = this.cs.fallbackSpeechRecognizerVendor = this.fallbackVendor;
this.language = this.cs.fallbackSpeechRecognizerLanguage = this.fallbackLanguage;
this.label = this.cs.fallbackSpeechRecognizerLabel = this.fallbackLabel;
this.data.recognizer.vendor = this.vendor;
this.data.recognizer.language = this.language;
this.data.recognizer.label = this.label;
@@ -261,6 +374,57 @@ class SttTask extends Task {
});
}
formatOpenAIPrompt(cs, {prompt, hintsTemplate, conversationHistoryTemplate, hints}) {
let conversationHistoryPrompt, hintsPrompt;
/* generate conversation history from template */
if (conversationHistoryTemplate) {
const {processed, turns} = processTurnString(conversationHistoryTemplate);
this.logger.debug({processed, turns}, 'SttTask: processed conversation history template');
conversationHistoryPrompt = cs.getFormattedConversation(turns || 4);
//this.logger.debug({conversationHistoryPrompt}, 'SttTask: conversation history');
if (conversationHistoryPrompt) {
conversationHistoryPrompt = processed.replace('{{turns}}', `\n${conversationHistoryPrompt}\nuser: `);
}
}
/* generate hints from template */
if (hintsTemplate && Array.isArray(hints) && hints.length > 0) {
hintsPrompt = hintsTemplate.replace('{{hints}}', hints);
}
/* combine into final prompt */
let finalPrompt = prompt || '';
if (hintsPrompt) {
finalPrompt = `${finalPrompt}\n${hintsPrompt}`;
}
if (conversationHistoryPrompt) {
finalPrompt = `${finalPrompt}\n${conversationHistoryPrompt}`;
}
this.logger.debug({
finalPrompt,
hints,
hintsPrompt,
conversationHistoryTemplate,
conversationHistoryPrompt
}, 'SttTask: formatted OpenAI prompt');
return finalPrompt?.trimStart();
}
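A hedged walk-through of the assembly above; the template strings and history content are invented for illustration, while the {{turns:N}} and {{hints}} placeholders and the ordering (prompt, then hints, then history) come from the code:
// illustrative inputs
const prompt = 'Transcribe caller speech for a support line.';
const hintsTemplate = 'Likely words: {{hints}}';
const conversationHistoryTemplate = 'Recent conversation:{{turns:2}}';
const hints = ['invoice', 'refund'];
// processTurnString(conversationHistoryTemplate)
//   -> { processed: 'Recent conversation:{{turns}}', turns: 2 }
// suppose cs.getFormattedConversation(2) returns:
//   'assistant: How can I help?\nuser: I need a refund'
// the final prompt (after trimStart) would read:
//   'Transcribe caller speech for a support line.
//    Likely words: invoice,refund
//    Recent conversation:
//    assistant: How can I help?
//    user: I need a refund
//    user: '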
/* some STT engines will keep listening after a final response, so no need to restart */
doesVendorContinueListeningAfterFinalTranscript(vendor) {
return (vendor.startsWith('custom:') || [
'soniox',
'aws',
'microsoft',
'deepgram',
'google',
'speechmatics',
'openai',
].includes(vendor));
}
_onCompileContext(ep, key, evt) {
const {addKey} = this.cs.srf.locals.dbHelpers;
this.logger.debug({evt}, `received cobalt compile context event, will cache under ${key}`);
@@ -296,7 +460,7 @@ class SttTask extends Task {
dgOptions.utteranceEndMs = dgOptions.utteranceEndMs || asrTimeout;
}
_onVendorConnect(_cs, _ep) {
_onVendorConnect(cs, _ep) {
this.logger.debug(`TaskGather:_on${this.vendor}Connect`);
}
@@ -309,6 +473,7 @@ class SttTask extends Task {
message: 'STT failure reported by vendor',
detail: evt.error,
vendor: this.vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, `Error generating alert for ${this.vendor} connection failure`));
}
@@ -321,6 +486,7 @@ class SttTask extends Task {
alert_type: AlertType.STT_FAILURE,
message: `Failed connecting to ${this.vendor} speech recognizer: ${reason}`,
vendor: this.vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, `Error generating alert for ${this.vendor} connection failure`));
}
}

View File

@@ -1,5 +1,5 @@
const Emitter = require('events');
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const {TaskPreconditions} = require('../utils/constants');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const WsRequestor = require('../utils/ws-requestor');
@@ -19,6 +19,7 @@ class Task extends Emitter {
this.data = data;
this.actionHook = this.data.actionHook;
this.id = data.id;
this.taskId = crypto.randomUUID();
this._killInProgress = false;
this._completionPromise = new Promise((resolve) => this._completionResolver = resolve);
@@ -165,7 +166,7 @@ class Task extends Emitter {
span.setAttributes({'http.body': JSON.stringify(params)});
try {
if (this.id) params.verb_id = this.id;
const json = await this.cs.requestor.request(type, this.actionHook, params, httpHeaders);
const json = await this.cs.requestor.request(type, this.actionHook, params, httpHeaders, span);
span.setAttributes({'http.statusCode': 200});
const isWsConnection = this.cs.requestor instanceof WsRequestor;
if (!isWsConnection || (expectResponse && json && Array.isArray(json) && json.length)) {
@@ -209,7 +210,7 @@ class Task extends Emitter {
const httpHeaders = b3 && {b3};
span.setAttributes({'http.body': JSON.stringify(params)});
try {
const json = await cs.requestor.request('verb:hook', hook, params, httpHeaders);
const json = await cs.requestor.request('verb:hook', hook, params, httpHeaders, span);
span.setAttributes({'http.statusCode': 200});
span.end();
if (json && Array.isArray(json)) {
@@ -272,7 +273,7 @@ class Task extends Emitter {
}
async transferCallToFeatureServer(cs, sipAddress, opts) {
const uuid = uuidv4();
const uuid = crypto.randomUUID();
const {addKey} = cs.srf.locals.dbHelpers;
const obj = Object.assign({}, cs.application);
delete obj.requestor;

View File

@@ -6,16 +6,25 @@ const {
AwsTranscriptionEvents,
AzureTranscriptionEvents,
DeepgramTranscriptionEvents,
GladiaTranscriptionEvents,
DeepgramfluxTranscriptionEvents,
SonioxTranscriptionEvents,
CobaltTranscriptionEvents,
IbmTranscriptionEvents,
NvidiaTranscriptionEvents,
JambonzTranscriptionEvents,
TranscribeStatus,
AssemblyAiTranscriptionEvents
AssemblyAiTranscriptionEvents,
HoundifyTranscriptionEvents,
VoxistTranscriptionEvents,
CartesiaTranscriptionEvents,
OpenAITranscriptionEvents,
VerbioTranscriptionEvents,
SpeechmaticsTranscriptionEvents
} = require('../utils/constants.json');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const SttTask = require('./stt-task');
const { SpeechCredentialError } = require('../utils/error');
const STT_LISTEN_SPAN_NAME = 'stt-listen';
@@ -24,8 +33,8 @@ class TaskTranscribe extends SttTask {
super(logger, opts, parentTask);
this.transcriptionHook = this.data.transcriptionHook;
this.translationHook = this.data.translationHook;
this.earlyMedia = this.data.earlyMedia === true || (parentTask && parentTask.earlyMedia);
if (this.data.recognizer) {
this.interim = !!this.data.recognizer.interim;
this.separateRecognitionPerChannel = this.data.recognizer.separateRecognitionPerChannel;
@@ -73,7 +82,20 @@ class TaskTranscribe extends SttTask {
return this.channel === 2 || this.separateRecognitionPerChannel && this.ep2;
}
async exec(cs, {ep, ep2}) {
async exec(cs, obj) {
try {
await this.handling(cs, obj);
} catch (error) {
if (error instanceof SpeechCredentialError) {
this.logger.info('Transcribe failed due to SpeechCredentialError, finished!');
this.notifyTaskDone();
return;
}
throw error;
}
}
async handling(cs, {ep, ep2}) {
await super.exec(cs, {ep, ep2});
if (this.data.recognizer.vendor === 'nuance') {
@@ -84,11 +106,10 @@ class TaskTranscribe extends SttTask {
...this.data.recognizer.nuanceOptions
};
}
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, cs.srf);
if (cs.hasGlobalSttHints) {
const {hints, hintsBoost} = cs.globalSttHints;
this.data.recognizer.hints = this.data.recognizer.hints.concat(hints);
this.data.recognizer.hints = this.data.recognizer?.hints?.concat(hints);
if (!this.data.recognizer.hintsBoost && hintsBoost) this.data.recognizer.hintsBoost = hintsBoost;
this.logger.debug({hints: this.data.recognizer.hints, hintsBoost: this.data.recognizer.hintsBoost},
'Transcribe:exec - applying global sttHints');
@@ -101,9 +122,6 @@ class TaskTranscribe extends SttTask {
if (this.transcribing2) {
await this._startTranscribing(cs, ep2, 2);
}
updateSpeechCredentialLastUsed(this.sttCredentials.speech_credential_sid)
.catch(() => {/*already logged error */});
} catch (err) {
if (!(await this._startFallback(cs, ep, {error: err}))) {
this.logger.info(err, 'TaskTranscribe:exec - error');
@@ -122,22 +140,30 @@ class TaskTranscribe extends SttTask {
stopTranscription = true;
this.ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname
bugname: this.bugname,
gracefulShutdown: this.paused ? false : true
})
.catch((err) => this.logger.info(err, 'Error TaskTranscribe:kill'));
}
if (this.transcribing2 && this.ep2?.connected) {
stopTranscription = true;
this.ep2.stopTranscription({vendor: this.vendor, bugname: this.bugname})
this.ep2.stopTranscription({
vendor: this.vendor,
bugname: this.bugname,
gracefulShutdown: this.paused ? false : true
})
.catch((err) => this.logger.info(err, 'Error TaskTranscribe:kill'));
}
this.cs.emit('transcribe-stop');
return stopTranscription;
}
async kill(cs) {
super.kill(cs);
const stopTranscription = this._stopTranscription();
cs.stopSttLatencyVad();
// hang up after 1.5 sec if we don't get a final transcription
if (stopTranscription) this._timer = setTimeout(() => this.notifyTaskDone(), 1500);
else this.notifyTaskDone();
@@ -213,16 +239,48 @@ class TaskTranscribe extends SttTask {
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
/* if app sets deepgramOptions.utteranceEndMs they essentially want continuous asr */
//if (opts.DEEPGRAM_SPEECH_UTTERANCE_END_MS) this.isContinuousAsr = true;
break;
case 'deepgramflux':
this.bugname = `${this.bugname_prefix}deepgramflux_transcribe`;
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, DeepgramfluxTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'gladia':
this.bugname = `${this.bugname_prefix}gladia_transcribe`;
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, GladiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
break;
case 'soniox':
this.bugname = `${this.bugname_prefix}soniox_transcribe`;
this.addCustomEventListener(ep, SonioxTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
break;
case 'verbio':
this.bugname = `${this.bugname_prefix}verbio_transcribe`;
this.addCustomEventListener(
ep, VerbioTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep));
break;
case 'cobalt':
this.bugname = `${this.bugname_prefix}cobalt_transcribe`;
this.addCustomEventListener(ep, CobaltTranscriptionEvents.Transcription,
@@ -280,6 +338,72 @@ class TaskTranscribe extends SttTask {
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
case 'houndify':
this.bugname = `${this.bugname_prefix}houndify_transcribe`;
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Error,
this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, HoundifyTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
break;
case 'voxist':
this.bugname = `${this.bugname_prefix}voxist_transcribe`;
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep,
VoxistTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, VoxistTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, VoxistTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
case 'cartesia':
this.bugname = `${this.bugname_prefix}cartesia_transcribe`;
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.Transcription,
this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep,
CartesiaTranscriptionEvents.Connect, this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.Error, this._onVendorError.bind(this, cs, ep));
this.addCustomEventListener(ep, CartesiaTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep, channel));
break;
case 'speechmatics':
this.bugname = `${this.bugname_prefix}speechmatics_transcribe`;
this.addCustomEventListener(
ep, SpeechmaticsTranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(
ep, SpeechmaticsTranscriptionEvents.Translation, this._onTranslation.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Info,
this._onSpeechmaticsInfo.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.RecognitionStarted,
this._onSpeechmaticsRecognitionStarted.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, SpeechmaticsTranscriptionEvents.Error,
this._onSpeechmaticsError.bind(this, cs, ep));
break;
case 'openai':
this.bugname = `${this.bugname_prefix}openai_transcribe`;
this.addCustomEventListener(
ep, OpenAITranscriptionEvents.Transcription, this._onTranscription.bind(this, cs, ep, channel));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.Connect,
this._onVendorConnect.bind(this, cs, ep));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.ConnectFailure,
this._onVendorConnectFailure.bind(this, cs, ep));
this.addCustomEventListener(ep, OpenAITranscriptionEvents.Error,
this._onOpenAIErrror.bind(this, cs, ep));
this.modelSupportsConversationTracking = opts.OPENAI_MODEL !== 'whisper-1';
break;
default:
if (this.vendor.startsWith('custom:')) {
this.bugname = `${this.bugname_prefix}${this.vendor}_transcribe`;
@@ -315,6 +439,25 @@ class TaskTranscribe extends SttTask {
async _transcribe(ep) {
this.logger.debug(
`TaskTranscribe:_transcribe - starting transcription vendor ${this.vendor} bugname ${this.bugname}`);
/* special feature for openai: we can provide a prompt that includes recent conversation history */
let prompt;
if (this.vendor === 'openai') {
if (this.modelSupportsConversationTracking) {
prompt = this.formatOpenAIPrompt(this.cs, {
prompt: this.data.recognizer?.openaiOptions?.prompt,
hintsTemplate: this.data.recognizer?.openaiOptions?.promptTemplates?.hintsTemplate,
// eslint-disable-next-line max-len
conversationHistoryTemplate: this.data.recognizer?.openaiOptions?.promptTemplates?.conversationHistoryTemplate,
hints: this.data.recognizer?.hints,
});
this.logger.debug({prompt}, 'Transcribe:_transcribe - created an openai prompt');
}
else if (this.data.recognizer?.hints?.length > 0) {
prompt = this.data.recognizer?.hints.join(', ');
}
}
await ep.startTranscription({
vendor: this.vendor,
interim: this.interim ? true : false,
@@ -323,9 +466,16 @@ class TaskTranscribe extends SttTask {
bugname: this.bugname,
hostport: this.hostport
});
// Some vendors use a single connection, so we cannot rely on the onConnect event to track transcription start
this.cs.emit('transcribe-start');
}
async _onTranscription(cs, ep, channel, evt, fsEvent) {
// check if we are in graceful shutdown mode
if (ep.gracefulShutdownResolver) {
ep.gracefulShutdownResolver();
}
// make sure this is not a transcript from answering machine detection
const bugname = fsEvent.getHeader('media-bugname');
const finished = fsEvent.getHeader('transcription-session-finished');
@@ -337,6 +487,9 @@ class TaskTranscribe extends SttTask {
if (this.vendor === 'ibm' && evt?.state === 'listening') return;
// emit an event to the call session to track the time transcription is received
cs.emit('on-transcription');
if (this.vendor === 'deepgram' && evt.type === 'UtteranceEnd') {
/* we will only get this when we have set utterance_end_ms */
@@ -406,8 +559,9 @@ class TaskTranscribe extends SttTask {
this._startAsrTimer(channel);
/* some STT engines will keep listening after a final response, so no need to restart */
if (!['soniox', 'aws', 'microsoft', 'deepgram', 'google']
.includes(this.vendor)) this._startTranscribing(cs, ep, channel);
if (!this.doesVendorContinueListeningAfterFinalTranscript(this.vendor)) {
this._startTranscribing(cs, ep, channel);
}
}
else {
if (this.vendor === 'soniox') {
@@ -430,9 +584,7 @@ class TaskTranscribe extends SttTask {
this.logger.debug({evt}, 'TaskTranscribe:_onTranscription - sending final transcript');
this._resolve(channel, evt);
/* some STT engines will keep listening after a final response, so no need to restart */
if (!['soniox', 'aws', 'microsoft', 'deepgram', 'google'].includes(this.vendor) &&
!this.vendor.startsWith('custom:')) {
if (!this.doesVendorContinueListeningAfterFinalTranscript(this.vendor)) {
this.logger.debug('TaskTranscribe:_onTranscription - restarting transcribe');
this._startTranscribing(cs, ep, channel);
}
@@ -457,14 +609,70 @@ class TaskTranscribe extends SttTask {
}
}
async _onTranslation(_cs, _ep, channel, evt, _fsEvent) {
this.logger.debug({evt}, 'TaskTranscribe:_onTranslation');
if (this.translationHook && evt.results?.length > 0) {
try {
const b3 = this.getTracingPropagation();
const httpHeaders = b3 && {b3};
const payload = {
...this.cs.callInfo,
...httpHeaders,
translation: {
channel,
language: evt.language,
translation: evt.results[0].content
}
};
this.logger.debug({payload}, 'sending translationHook');
const json = await this.cs.requestor.request('verb:hook', this.translationHook, payload);
this.logger.info({json}, 'completed translationHook');
if (json && Array.isArray(json) && !this.parentTask) {
const makeTask = require('./make_task');
const tasks = normalizeJambones(this.logger, json).map((tdata) => makeTask(this.logger, tdata));
if (tasks && tasks.length > 0) {
this.logger.info({tasks: tasks}, `${this.name} replacing application with ${tasks.length} tasks`);
this.cs.replaceApplication(tasks);
}
}
} catch (err) {
this.logger.info(err, 'TranscribeTask:_onTranslation error');
}
if (this.parentTask) {
this.parentTask.emit('translation', evt);
}
}
if (this.killed) {
this.logger.debug('TaskTranscribe:_onTranslation exiting after receiving final transcription');
this._clearTimer();
this.notifyTaskDone();
}
}
async _resolve(channel, evt) {
let sttLatencyMetrics = {};
if (evt.is_final) {
const sttLatency = this.cs.calculateSttLatency();
if (sttLatency) {
sttLatencyMetrics = {
'stt.latency_ms': `${sttLatency.stt_latency_ms}`,
'stt.talkspurts': JSON.stringify(sttLatency.talkspurts),
'stt.start_time': sttLatency.stt_start_time,
'stt.stop_time': sttLatency.stt_stop_time,
'stt.usage': sttLatency.stt_usage,
};
}
// time to reset the stt latency
this.cs.emit('transcribe-start');
/* we've got a final transcript, so end the otel child span for this channel */
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.label': this.label || 'None',
'stt.resolve': 'transcript',
'stt.result': JSON.stringify(evt)
'stt.result': JSON.stringify(evt),
...sttLatencyMetrics
});
this.childSpan[channel - 1].span.end();
}
@@ -473,9 +681,13 @@ class TaskTranscribe extends SttTask {
if (this.transcriptionHook) {
const b3 = this.getTracingPropagation();
const httpHeaders = b3 && {b3};
const latencies = Object.fromEntries(
Object.entries(sttLatencyMetrics).map(([key, value]) => [key.replace('stt.', 'stt_'), value])
);
const payload = {
...this.cs.callInfo,
...httpHeaders,
...latencies,
...(evt.alternatives && {speech: evt}),
...(evt.type && {speechEvent: evt})
};
@@ -516,7 +728,8 @@ class TaskTranscribe extends SttTask {
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'timeout'
'stt.resolve': 'timeout',
'stt.label': this.label || 'None',
});
this.childSpan[channel - 1].span.end();
}
@@ -528,12 +741,22 @@ class TaskTranscribe extends SttTask {
}
_onMaxDurationExceeded(cs, ep, channel) {
this.logger.debug(`TaskTranscribe:_onMaxDurationExceeded on channel ${channel}`);
this.restartDueToError(ep, channel, 'Max duration exceeded');
}
_onMaxBufferExceeded(cs, ep, channel) {
this.restartDueToError(ep, channel, 'Max buffer exceeded');
}
restartDueToError(ep, channel, reason) {
this.logger.debug(`TaskTranscribe:${reason} on channel ${channel}`);
if (this.paused) return;
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'max duration exceeded'
'stt.resolve': reason,
'stt.label': this.label || 'None',
});
this.childSpan[channel - 1].span.end();
}
@@ -556,10 +779,10 @@ class TaskTranscribe extends SttTask {
if (this.canFallback) {
_ep.stopTranscription({
vendor: this.vendor,
bugname: this.bugname
bugname: this.bugname,
gracefulShutdown: false
})
.catch((err) => this.logger.error({err}, `Error stopping transcription for primary vendor ${this.vendor}`));
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, cs.srf);
try {
this.notifyError({ msg: 'ASR error',
details:`STT Vendor ${this.vendor} error: ${evt.error || evt.reason}`, failover: 'in progress'});
@@ -570,7 +793,6 @@ class TaskTranscribe extends SttTask {
}
this[`_speechHandlersSet_${channel}`] = false;
this._startTranscribing(cs, _ep, channel);
updateSpeechCredentialLastUsed(this.sttCredentials.speech_credential_sid);
return true;
} catch (error) {
this.notifyError({ msg: 'ASR error',
@@ -591,6 +813,14 @@ class TaskTranscribe extends SttTask {
return;
}
this.logger.info({evt}, 'TaskTranscribe:_onJambonzError');
if (this.vendor === 'microsoft' &&
evt.error?.includes('Due to service inactivity, the client buffer exceeded maximum size. Resetting the buffer')) {
let channel = 1;
if (this.ep !== _ep) {
channel = 2;
}
return this._onMaxBufferExceeded(cs, _ep, channel);
}
if (this.paused) return;
const {writeAlerts, AlertType} = cs.srf.locals;
@@ -605,6 +835,7 @@ class TaskTranscribe extends SttTask {
alert_type: AlertType.STT_FAILURE,
message: `Custom speech vendor ${this.vendor} error: ${evt.error}`,
vendor: this.vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, 'Error generating alert for jambonz custom connection failure'));
if (!(await this._startFallback(cs, _ep, evt))) {
this.notifyTaskDone();
@@ -616,7 +847,8 @@ class TaskTranscribe extends SttTask {
if (this.childSpan[channel - 1] && this.childSpan[channel - 1].span) {
this.childSpan[channel - 1].span.setAttributes({
channel,
'stt.resolve': 'connection failure'
'stt.resolve': 'connection failure',
'stt.label': this.label || 'None',
});
this.childSpan[channel - 1].span.end();
}
@@ -625,6 +857,26 @@ class TaskTranscribe extends SttTask {
}
}
async _onSpeechmaticsRecognitionStarted(_cs, _ep, evt) {
this.logger.debug({evt}, 'TaskTranscribe:_onSpeechmaticsRecognitionStarted');
}
async _onSpeechmaticsInfo(_cs, _ep, evt) {
this.logger.debug({evt}, 'TaskTranscribe:_onSpeechmaticsInfo');
}
async _onSpeechmaticsError(cs, _ep, evt) {
// eslint-disable-next-line no-unused-vars
const {message, ...e} = evt;
this._onVendorError(cs, _ep, {error: JSON.stringify(e)});
}
async _onOpenAIErrror(cs, _ep, evt) {
// eslint-disable-next-line no-unused-vars
const {message, ...e} = evt;
this._onVendorError(cs, _ep, {error: JSON.stringify(e)});
}
_startAsrTimer(channel) {
if (this.vendor === 'deepgram') return; // no need
assert(this.isContinuousAsr);

View File

@@ -1,5 +1,17 @@
const Task = require('./task');
const { TaskPreconditions } = require('../utils/constants');
const { SpeechCredentialError } = require('../utils/error');
const dbUtils = require('../utils/db-utils');
const extractPlaybackId = (str) => {
// Match say:{...} and capture the content inside braces
const match = str.match(/say:\{([^}]*)\}/);
if (!match) return null;
// Look for playback_id=value within the captured content
const playbackMatch = match[1].match(/playback_id=([^,]*)/);
return playbackMatch ? playbackMatch[1] : null;
};
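A quick illustrative check of the helper above; the playback id and file name values are made up:
const assert = require('assert');
// a streaming say string carries its playback_id inside the braces
assert.strictEqual(
  extractPlaybackId('say:{playback_id=abc-123,session-uuid=xyz} Hello world'),
  'abc-123');
// a plain cached file path has no playback id
assert.strictEqual(extractPlaybackId('/tmp/cached-audio.mp3'), null);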
class TtsTask extends Task {
@@ -10,41 +22,179 @@ class TtsTask extends Task {
this.preconditions = TaskPreconditions.Endpoint;
this.earlyMedia = this.data.earlyMedia === true || (parentTask && parentTask.earlyMedia);
/**
 * The task uses taskIncludeSynthesizer to decide which label applies:
 * if taskIncludeSynthesizer === true, use the label from verb.synthesizer, even when it is empty
 * if taskIncludeSynthesizer === false, use the label from application.synthesizer
 */
this.taskIncludeSynthesizer = !!this.data.synthesizer;
this.synthesizer = this.data.synthesizer || {};
this.disableTtsCache = this.data.disableTtsCache;
this.options = this.synthesizer.options || {};
this.instructions = this.data.instructions;
this.playbackIds = [];
}
getPlaybackId(offset) {
return this.playbackIds[offset];
}
async exec(cs) {
super.exec(cs);
// update disableTtsCache from call session if not set in task
if (this.data.disableTtsCache == null) {
this.disableTtsCache = cs.disableTtsCache;
}
if (cs.synthesizer) {
this.options = {...cs.synthesizer.options, ...this.options};
this.data.synthesizer = this.data.synthesizer || {};
for (const k in cs.synthesizer) {
const newValue = this.data.synthesizer && this.data.synthesizer[k] !== undefined ?
this.data.synthesizer[k] :
cs.synthesizer[k];
if (Array.isArray(newValue)) {
this.data.synthesizer[k] = [...(this.data.synthesizer[k] || []), ...cs.synthesizer[k]];
} else if (typeof newValue === 'object' && newValue !== null) {
this.data.synthesizer[k] = { ...(this.data.synthesizer[k] || {}), ...cs.synthesizer[k] };
} else {
this.data.synthesizer[k] = newValue;
}
}
}
const fullText = Array.isArray(this.text) ? this.text.join(' ') : this.text;
// in case dub verb, text might not be set.
if (fullText?.length > 0) {
cs.emit('botSaid', fullText);
}
}
async _synthesizeWithSpecificVendor(cs, ep, {
vendor,
language,
voice,
label,
disableTtsStreaming,
preCache
}) {
getTtsVendorData(cs) {
const vendor = this.synthesizer.vendor && this.synthesizer.vendor !== 'default' ?
this.synthesizer.vendor :
cs.speechSynthesisVendor;
const language = this.synthesizer.language && this.synthesizer.language !== 'default' ?
this.synthesizer.language :
cs.speechSynthesisLanguage ;
const voice = this.synthesizer.voice && this.synthesizer.voice !== 'default' ?
this.synthesizer.voice :
cs.speechSynthesisVoice;
const label = this.taskIncludeSynthesizer ? this.synthesizer.label : cs.speechSynthesisLabel;
return {vendor, language, voice, label};
}
async setTtsStreamingChannelVars(vendor, language, voice, credentials, ep) {
const {api_key, model_id, api_uri, custom_tts_streaming_url, auth_token, options} = credentials;
// api_key, model_id, api_uri, custom_tts_streaming_url, and auth_token are encoded in the credentials
// allow them to be overridden via config, using options
// give preference to options passed in via config
const local_options = {...JSON.parse(options), ...this.options};
const local_voice_settings = {...JSON.parse(options).voice_settings, ...this.options.voice_settings};
const local_api_key = local_options.api_key ?? api_key;
const local_model_id = local_options.model_id ?? model_id;
const local_api_uri = local_options.api_uri ?? api_uri;
const local_custom_tts_streaming_url = local_options.custom_tts_streaming_url ?? custom_tts_streaming_url;
const local_auth_token = local_options.auth_token ?? auth_token;
let obj;
switch (vendor) {
case 'deepgram':
obj = {
DEEPGRAM_API_KEY: local_api_key,
DEEPGRAM_TTS_STREAMING_MODEL: voice
};
break;
case 'cartesia':
obj = {
CARTESIA_API_KEY: local_api_key,
CARTESIA_TTS_STREAMING_MODEL_ID: local_model_id,
CARTESIA_TTS_STREAMING_VOICE_ID: voice,
CARTESIA_TTS_STREAMING_LANGUAGE: language || 'en',
};
break;
case 'elevenlabs':
// eslint-disable-next-line max-len
const {stability, similarity_boost, use_speaker_boost, style, speed} = local_voice_settings || {};
obj = {
ELEVENLABS_API_KEY: local_api_key,
...(api_uri && {ELEVENLABS_API_URI: local_api_uri}),
ELEVENLABS_TTS_STREAMING_MODEL_ID: local_model_id,
ELEVENLABS_TTS_STREAMING_VOICE_ID: voice,
// 20/12/2024 - only eleven_turbo_v2_5 supports multiple languages
...(['eleven_turbo_v2_5'].includes(local_model_id) && {ELEVENLABS_TTS_STREAMING_LANGUAGE: language}),
...(stability && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_STABILITY: stability}),
...(similarity_boost && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_SIMILARITY_BOOST: similarity_boost}),
...(use_speaker_boost && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_USE_SPEAKER_BOOST: use_speaker_boost}),
...(style && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_STYLE: style}),
// speed ranges from 0.7 to 1.2 (1.0 is default); make sure we send the value even if it's 0
...(speed !== null && speed !== undefined && {ELEVENLABS_TTS_STREAMING_VOICE_SETTINGS_SPEED: `${speed}`}),
...(local_options.pronunciation_dictionary_locators &&
Array.isArray(local_options.pronunciation_dictionary_locators) && {
ELEVENLABS_TTS_STREAMING_PRONUNCIATION_DICTIONARY_LOCATORS:
JSON.stringify(local_options.pronunciation_dictionary_locators)
}),
};
break;
case 'rimelabs':
const {
pauseBetweenBrackets, phonemizeBetweenBrackets, inlineSpeedAlpha, speedAlpha, reduceLatency
} = local_options;
obj = {
RIMELABS_API_KEY: local_api_key,
RIMELABS_TTS_STREAMING_MODEL_ID: local_model_id,
RIMELABS_TTS_STREAMING_VOICE_ID: voice,
RIMELABS_TTS_STREAMING_LANGUAGE: language || 'en',
...(pauseBetweenBrackets && {RIMELABS_TTS_STREAMING_PAUSE_BETWEEN_BRACKETS: pauseBetweenBrackets}),
...(phonemizeBetweenBrackets &&
{RIMELABS_TTS_STREAMING_PHONEMIZE_BETWEEN_BRACKETS: phonemizeBetweenBrackets}),
...(inlineSpeedAlpha && {RIMELABS_TTS_STREAMING_INLINE_SPEED_ALPHA: inlineSpeedAlpha}),
...(speedAlpha && {RIMELABS_TTS_STREAMING_SPEED_ALPHA: speedAlpha}),
...(reduceLatency && {RIMELABS_TTS_STREAMING_REDUCE_LATENCY: reduceLatency})
};
break;
default:
if (vendor.startsWith('custom:')) {
const use_tls = custom_tts_streaming_url.startsWith('wss://');
obj = {
CUSTOM_TTS_STREAMING_HOST: local_custom_tts_streaming_url.replace(/^(ws|wss):\/\//, ''),
CUSTOM_TTS_STREAMING_API_KEY: local_auth_token,
CUSTOM_TTS_STREAMING_VOICE_ID: voice,
CUSTOM_TTS_STREAMING_LANGUAGE: language || 'en',
CUSTOM_TTS_STREAMING_USE_TLS: use_tls
};
} else {
throw new Error(`vendor ${vendor} is not supported for tts streaming yet`);
}
}
this.logger.debug({vendor, credentials, obj}, 'setTtsStreamingChannelVars');
await ep.set(obj);
}
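For context, a hedged sketch of an application payload that exercises this override path; the option names mirror the fields parsed above, while the voice id and values are purely illustrative and the full verb schema lives in @jambonz/verb-specifications:

  {
    "verb": "config",
    "synthesizer": {
      "vendor": "elevenlabs",
      "voice": "your-voice-id",
      "options": {
        "model_id": "eleven_turbo_v2_5",
        "voice_settings": { "stability": 0.5, "speed": 1.1 }
      }
    }
  }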
async _synthesizeWithSpecificVendor(cs, ep, {vendor, language, voice, label, preCache = false}) {
const {srf, accountSid:account_sid} = cs;
const {updateSpeechCredentialLastUsed} = require('../utils/db-utils')(this.logger, srf);
const {writeAlerts, AlertType, stats} = srf.locals;
const {synthAudio} = srf.locals.dbHelpers;
const engine = this.synthesizer.engine || cs.synthesizer?.engine || 'neural';
const salt = cs.callSid;
let credentials = cs.getSpeechCredentials(vendor, 'tts', label);
if (!credentials) {
throw new SpeechCredentialError(
`No text-to-speech service credentials for ${vendor} with labels: ${label} have been configured`);
}
/* parse Nuance voices into name and model */
let model;
if (vendor === 'nuance' && voice) {
const arr = /([A-Za-z-]*)\s+-\s+(enhanced|standard)/.exec(voice);
if (arr) {
voice = arr[1];
model = arr[2];
this.model = arr[2];
}
} else if (vendor === 'deepgram') {
model = voice;
this.model = voice;
}
/* allow for microsoft custom region voice and api_key to be specified as an override */
@@ -62,42 +212,84 @@ class TtsTask extends Task {
credentials.optimize_streaming_latency = this.options.optimize_streaming_latency
|| credentials.optimize_streaming_latency;
voice = this.options.voice_id || voice;
} else if (vendor === 'rimelabs') {
credentials = credentials || {};
credentials.model_id = this.options.model_id || credentials.model_id;
} else if (vendor === 'inworld') {
credentials = credentials || {};
credentials.model_id = this.options.model_id || credentials.model_id;
} else if (vendor === 'whisper') {
credentials = credentials || {};
credentials.model_id = this.options.model_id || credentials.model_id;
} else if (vendor === 'verbio') {
credentials = credentials || {};
credentials.engine_version = this.options.engine_version || credentials.engine_version;
} else if (vendor === 'playht') {
credentials = credentials || {};
credentials.voice_engine = this.options.voice_engine || credentials.voice_engine;
} else if (vendor === 'google' && typeof voice === 'string' && voice.startsWith('custom_')) {
const {lookupGoogleCustomVoice} = dbUtils(this.logger, cs.srf);
const arr = /custom_(.*)/.exec(voice);
if (arr) {
const google_custom_voice_sid = arr[1];
const [custom_voice] = await lookupGoogleCustomVoice(google_custom_voice_sid);
if (custom_voice.use_voice_cloning_key) {
voice = {
voice_cloning_key: custom_voice.voice_cloning_key,
};
}
}
} else if (vendor === 'cartesia') {
credentials.model_id = this.options.model_id || credentials.model_id;
}
ep.set({
tts_engine: vendor,
tts_voice: voice,
cache_speech_handles: 1,
}).catch((err) => this.logger.info({err}, `${this.name}: Error setting tts_engine on endpoint`));
this.model_id = credentials.model_id;
if (!preCache) this.logger.info({vendor, language, voice, model}, `${this.name}:exec`);
/**
* note on cache_speech_handles. This was found to be risky.
* It can cause a crash in the following sequence on a single call:
* 1. Stream tts on vendor A with cache_speech_handles=1, then
* 2. Stream tts on vendor B with cache_speech_handles=1
*
* we previously tried to track when vendors were switched and manage the flag accordingly,
* but it is difficult to track all the scenarios and the benefit (slightly faster start to tts playout)
* is probably minimal. DH.
*/
ep.set({
tts_engine: vendor.startsWith('custom:') ? 'custom' : vendor,
tts_voice: voice,
//cache_speech_handles: !cs.currentTtsVendor || cs.currentTtsVendor === vendor ? 1 : 0,
cache_speech_handles: 0,
}).catch((err) => this.logger.info({err}, 'Error setting tts_engine on endpoint'));
// set the current vendor on the call session
// If vendor is changed from the previous one, then reset the cache_speech_handles flag
//cs.currentTtsVendor = vendor;
if (!preCache && !this._disableTracing)
this.logger.debug({vendor, language, voice, model: this.model}, 'TaskSay:exec');
try {
if (!credentials) {
writeAlerts({
account_sid,
alert_type: AlertType.TTS_NOT_PROVISIONED,
vendor
vendor,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, 'Error generating alert for no tts'));
this.notifyError({
msg: 'TTS error',
details:`No speech credentials provisioned for selected vendor ${vendor}`
});
throw new Error('no provisioned speech credentials for TTS');
throw new SpeechCredentialError('no provisioned speech credentials for TTS');
}
// synthesize all of the text elements
let lastUpdated = false;
/* produce an audio segment from the provided text */
const generateAudio = async(text) => {
if (this.killed) return;
if (text.startsWith('silence_stream://')) return text;
const generateAudio = async(text, index) => {
if (this.killed) return {index, filePath: null};
if (text.startsWith('silence_stream://')) return {index, filePath: text};
/* otel: trace time for tts */
if (!preCache && !this.parentTask) {
if (!preCache && !this._disableTracing) {
const {span} = this.startChildSpan('tts-generation', {
'tts.vendor': vendor,
'tts.language': language,
'tts.voice': voice
'tts.voice': voice,
'tts.label': label || 'None',
});
this.otelSpan = span;
}
@@ -105,46 +297,61 @@ class TtsTask extends Task {
const {filePath, servedFromCache, rtt} = await synthAudio(stats, {
account_sid,
text,
instructions: this.instructions,
vendor,
language,
voice,
engine,
model,
model: this.model,
salt,
credentials,
options: this.options,
disableTtsCache : this.disableTtsCache,
disableTtsStreaming,
preCache
renderForCaching: preCache
});
if (!filePath.startsWith('say:')) {
this.logger.debug(`file ${filePath}, served from cache ${servedFromCache}`);
this.logger.debug(`Say: file ${filePath}, served from cache ${servedFromCache}`);
if (filePath) cs.trackTmpFile(filePath);
if (this.otelSpan) {
this.otelSpan.setAttributes({'tts.cached': servedFromCache});
this.otelSpan.end();
this.otelSpan = null;
}
if (!servedFromCache && !lastUpdated) {
lastUpdated = true;
updateSpeechCredentialLastUsed(credentials.speech_credential_sid).catch(() => {/* logged error */});
}
if (!servedFromCache && rtt && !preCache) {
if (!servedFromCache && rtt && !preCache && !this._disableTracing) {
this.notifyStatus({
event: 'synthesized-audio',
vendor,
language,
characters: text.length,
elapsedTime: rtt
elapsedTime: rtt,
servedFromCache,
'id': this.id
});
}
if (servedFromCache) {
this.notifyStatus({
event: 'synthesized-audio',
vendor,
language,
servedFromCache,
'id': this.id
});
}
return {index, filePath, playbackId: null};
}
else {
this.logger.debug('a streaming tts api will be used');
const playbackId = extractPlaybackId(filePath);
this.logger.debug('Say: a streaming tts api will be used');
const modifiedPath = filePath.replace('say:{', `say:{session-uuid=${ep.uuid},`);
return modifiedPath;
this.notifyStatus({
event: 'synthesized-audio',
vendor,
language,
servedFromCache,
'id': this.id
});
return {index, filePath: modifiedPath, playbackId};
}
return filePath;
} catch (err) {
this.logger.info({err}, 'Error synthesizing tts');
if (this.otelSpan) this.otelSpan.end();
@@ -152,19 +359,32 @@ class TtsTask extends Task {
account_sid: cs.accountSid,
alert_type: AlertType.TTS_FAILURE,
vendor,
detail: err.message
detail: err.message,
target_sid: cs.callSid
}).catch((err) => this.logger.info({err}, 'Error generating alert for tts failure'));
this.notifyError({msg: 'TTS error', details: err.message || err});
throw err;
}
};
const arr = this.text.map((t) => (this._validateURL(t) ? t : generateAudio(t)));
return (await Promise.all(arr)).filter((fp) => fp && fp.length);
// processing all text segments in parallel can cause ordering issues,
// so we attach an index to each promise result and sort them later
const arr = this.text.map((t, index) => (this._validateURL(t) ?
Promise.resolve({index, filePath: t, playbackId: null}) : generateAudio(t, index)));
const results = await Promise.all(arr);
const sorted = results.sort((a, b) => a.index - b.index);
return sorted
.filter((fp) => fp.filePath && fp.filePath.length)
.map((r) => {
this.playbackIds.push(r.playbackId);
return r.filePath;
});
} catch (err) {
this.logger.info(err, 'TaskSay:exec error');
throw err;
}
}
_validateURL(urlString) {

View File

@@ -2,7 +2,6 @@ const makeTask = require('../tasks/make_task');
const Emitter = require('events');
const { normalizeJambones } = require('@jambonz/verb-specifications');
const {TaskName} = require('../utils/constants');
const assert = require('assert');
/**
* ActionHookDelayProcessor
@@ -25,10 +24,12 @@ class ActionHookDelayProcessor extends Emitter {
this._active = false;
const enabled = this.init(opts);
if (enabled && (!this.actions || !Array.isArray(this.actions) || this.actions.length === 0)) {
if (enabled && this.noResponseTimeout &&
(!this.actions || !Array.isArray(this.actions) || this.actions.length === 0)) {
throw new Error('ActionHookDelayProcessor: no actions specified');
}
else if (enabled && this.actions.some((a) => !a.verb || ![TaskName.Say, TaskName.Play].includes(a.verb))) {
else if (enabled && this.actions &&
this.actions.some((a) => !a.verb || ![TaskName.Say, TaskName.Play].includes(a.verb))) {
throw new Error(`ActionHookDelayProcessor: invalid actions specified: ${JSON.stringify(this.actions)}`);
}
}
@@ -51,8 +52,9 @@ class ActionHookDelayProcessor extends Emitter {
this.actions = opts.actions;
this.retries = opts.retries || 0;
this.noResponseTimeout = opts.noResponseTimeout || 0;
this.noResponseTimeout = opts.noResponseTimeout;
this.noResponseGiveUpTimeout = opts.noResponseGiveUpTimeout;
this.giveUpActions = opts.giveUpActions;
// return false if these options actually disable the ahdp
return ('enable' in opts && opts.enable === true) ||
@@ -66,11 +68,16 @@ class ActionHookDelayProcessor extends Emitter {
this.logger.debug('ActionHookDelayProcessor#start: already started due to prior gather which is continuing');
return;
}
assert(!this._noResponseTimer);
this._active = true;
this._retryCount = 0;
const timeoutMs = this.noResponseTimeout === 0 ? 1 : this.noResponseTimeout * 1000;
this._noResponseTimer = setTimeout(this._onNoResponseTimer.bind(this), timeoutMs);
if (this.noResponseTimeout > 0) {
const timeoutMs = this.noResponseTimeout * 1000;
this._noResponseTimer = setTimeout(this._onNoResponseTimer.bind(this), timeoutMs);
} else {
this.logger.debug(
'ActionHookDelayProcessor#start: noResponseTimeout is 0 or undefined hence not calling _onNoResponseTimer'
);
}
if (this.noResponseGiveUpTimeout > 0) {
const timeoutMs = this.noResponseGiveUpTimeout * 1000;
@@ -79,7 +86,6 @@ class ActionHookDelayProcessor extends Emitter {
}
async stop() {
this.logger.debug('ActionHookDelayProcessor#stop');
this._active = false;
if (this._noResponseTimer) {
@@ -91,25 +97,19 @@ class ActionHookDelayProcessor extends Emitter {
this._noResponseGiveUpTimer = null;
}
if (this._taskInProgress) {
this.logger.debug(`ActionHookDelayProcessor#stop: killing task in progress: ${this._taskInProgress.name}`);
this.logger.debug(`ActionHookDelayProcessor#stop: stopping ${this._taskInProgress.name}`);
/** if we are doing a play, kill it immediately
* if we are doing a say, wait for it to finish
*/
if (TaskName.Say === this._taskInProgress.name) {
this._sayResolver = () => {
this.logger.debug('ActionHookDelayProcessor#stop: say is done, continue on..');
this._taskInProgress.kill(this.cs);
this._taskInProgress = null;
};
this.logger.debug('ActionHookDelayProcessor#stop returning promise');
return new Promise((resolve) => this._sayResolver = resolve);
}
else {
/* play */
this._taskInProgress.kill(this.cs);
this._sayResolver = () => {
this.logger.debug('ActionHookDelayProcessor#stop: play/say is done, continue on..');
//this._taskInProgress.kill(this.cs);
this._taskInProgress = null;
};
/* we let Say finish, but interrupt Play */
if (TaskName.Play === this._taskInProgress.name) {
await this._taskInProgress.kill(this.cs);
}
return new Promise((resolve) => this._sayResolver = resolve);
}
this.logger.debug('ActionHookDelayProcessor#stop returning');
}
@@ -126,7 +126,12 @@ class ActionHookDelayProcessor extends Emitter {
try {
this._taskInProgress = makeTask(this.logger, t[0]);
this._taskInProgress.disableTracing = true;
this._taskInProgress.exec(this.cs, {ep: this.ep});
this._taskInProgress.exec(this.cs, {ep: this.ep}).catch((err) => {
this.logger.info(`ActionHookDelayProcessor#_onNoResponseTimer: error playing file: ${err.message}`);
this._taskInProgress = null;
this.ep.removeAllListeners('playback-start');
this.ep.removeAllListeners('playback-stop');
});
} catch (err) {
this.logger.info(err, 'ActionHookDelayProcessor#_onNoResponseTimer: error starting action');
this._taskInProgress = null;
@@ -137,7 +142,9 @@ class ActionHookDelayProcessor extends Emitter {
this.logger.debug({evt}, 'got playback-start');
if (!this._active) {
this.logger.info({evt}, 'ActionHookDelayProcessor#_onNoResponseTimer: killing audio immediately');
this.ep.api('uuid_break', this.ep.uuid)
/* note: in a race condition we may have just hung up and cs.ep may have been cleared */
this.ep?.api('uuid_break', this.ep?.uuid)
.catch((err) => this.logger.info(err,
'ActionHookDelayProcessor#_onNoResponseTimer Error killing audio'));
}
@@ -147,7 +154,7 @@ class ActionHookDelayProcessor extends Emitter {
this._taskInProgress = null;
if (this._sayResolver) {
/* we were waiting for the play to finish before continuing to next task */
this.logger.debug({evt}, 'got playback-stop');
this.logger.debug({evt}, 'ActionHookDelayProcessor#_onNoResponseTimer got playback-stop');
this._sayResolver();
this._sayResolver = null;
}
@@ -166,9 +173,14 @@ class ActionHookDelayProcessor extends Emitter {
_onNoResponseGiveUpTimer() {
this._active = false;
this.logger.info('ActionHookDelayProcessor#_onNoResponseGiveUpTimer');
this.stop().catch((err) => {});
this.emit('giveup');
if (!this.giveUpActions) {
this.logger.info('ActionHookDelayProcessor#_onNoResponseGiveUpTimer');
this.stop().catch((err) => {});
this.emit('giveup');
} else {
this.logger.info('ActionHookDelayProcessor#_onNoResponseGiveUpTimer - giveUpActions');
this.emit('giveupWithTasks', this.giveUpActions);
}
}
}
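For reference, a hedged sketch of the options this processor consumes, roughly as they might be supplied on a config verb; the wrapper property name is an assumption, but the option names match init() above:

  {
    "verb": "config",
    "actionHookDelayAction": {
      "enable": true,
      "noResponseTimeout": 2,
      "noResponseGiveUpTimeout": 20,
      "actions": [
        { "verb": "say", "text": "Please hold while we look that up." }
      ],
      "giveUpActions": [
        { "verb": "say", "text": "Sorry, we were unable to complete your request." }
      ]
    }
  }

With these values the say action starts after 2 seconds of silence from the application, and after 20 seconds the giveUpActions are emitted via the giveupWithTasks event rather than the plain giveup event.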

View File

@@ -45,6 +45,7 @@ if (VMD_HINTS_FILE) {
});
}
class Amd extends Emitter {
constructor(logger, cs, opts) {
super();
@@ -68,6 +69,8 @@ class Amd extends Emitter {
this.getIbmAccessToken = getIbmAccessToken;
const {setChannelVarsForStt} = require('./transcription-utils')(logger);
this.setChannelVarsForStt = setChannelVarsForStt;
this.digitCount = opts.digitCount || 0;
this.numberRegEx = RegExp(`[0-9]{${this.digitCount}}`);
const {
noSpeechTimeoutMs = 5000,
@@ -153,7 +156,7 @@ class Amd extends Emitter {
const wordCount = t.alternatives[0].transcript.split(' ').length;
const final = t.is_final;
const foundHint = hints.find((h) => t.alternatives[0].transcript.includes(h));
const foundHint = hints.find((h) => t.alternatives[0].transcript.toLowerCase().includes(h.toLowerCase()));
if (foundHint) {
/* we detected a common voice mail greeting */
this.logger.debug(`Amd:evaluateTranscription: found hint ${foundHint}`);
@@ -163,6 +166,14 @@ class Amd extends Emitter {
language: t.language_code
});
}
else if (this.digitCount != 0 && this.numberRegEx.test(t.alternatives[0].transcript)) {
/* a string of numbers is typically a machine */
this.emit(this.decision = AmdEvents.MachineDetected, {
reason: 'digit count',
greeting: t.alternatives[0].transcript,
language: t.language_code
});
}
else if (final && wordCount < this.thresholdWordCount) {
/* a short greeting is typically a human */
this.emit(this.decision = AmdEvents.HumanDetected, {
@@ -210,7 +221,8 @@ module.exports = (logger) => {
account_sid: cs.accountSid,
alert_type: AlertType.STT_FAILURE,
vendor: vendor,
detail: err.message
detail: err.message,
target_sid: cs.callSid
});
}).catch((err) => logger.info({err}, 'Error generating alert for tts failure'));
@@ -245,7 +257,10 @@ module.exports = (logger) => {
const amd = ep.amd = new Amd(logger, cs, opts);
const {vendor, language} = amd;
let sttCredentials = amd.sttCredentials;
const hints = voicemailHints[language] || [];
// hints from configuration might be too long for a specific language and vendor, which can prevent the
// freeswitch transcribe modules from connecting to the vendor. hints are used in the next step to validate
// whether the transcription matches voicemail hints.
const hints = [];
if (vendor === 'nuance' && sttCredentials.client_id) {
/* get nuance access token */
@@ -266,13 +281,17 @@ module.exports = (logger) => {
/* set stt options */
logger.info(`starting amd for vendor ${vendor} and language ${language}`);
const sttOpts = amd.setChannelVarsForStt({name: TaskName.Gather}, sttCredentials, language, {
vendor,
hints,
enhancedModel: true,
altLanguages: opts.recognizer?.altLanguages || [],
initialSpeechTimeoutMs: opts.resolveTimeoutMs,
});
/* if opts contains recognizer object use that config for stt, otherwise use defaults */
const rOpts = opts.recognizer ?
opts.recognizer :
{
vendor,
hints,
enhancedModel: true,
altLanguages: opts.recognizer?.altLanguages || [],
initialSpeechTimeoutMs: opts.resolveTimeoutMs,
};
const sttOpts = amd.setChannelVarsForStt({name: TaskName.Gather}, sttCredentials, language, rOpts);
await ep.set(sttOpts).catch((err) => logger.info(err, 'Error setting channel variables'));
@@ -403,7 +422,11 @@ module.exports = (logger) => {
}
if (ep.connected) {
ep.stopTranscription({vendor, bugname})
ep.stopTranscription({
vendor,
bugname,
gracefulShutdown: false
})
.catch((err) => logger.info(err, 'stopAmd: Error stopping transcription'));
task.emit('amd', {type: AmdEvents.Stopped});
ep.execute('avmd_stop').catch((err) => this.logger.info(err, 'Error stopping avmd'));
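A hedged sketch of how the new digitCount option might be supplied alongside the existing amd options (the dial-verb placement is assumed; digitCount and recognizer match the fields read above):

  {
    "verb": "dial",
    "amd": {
      "digitCount": 4,
      "recognizer": { "vendor": "google", "language": "en-US" }
    },
    "target": [ { "type": "phone", "number": "15551234567" } ]
  }

With digitCount set to 4, any transcription containing a run of four digits (for example an extension read back by an IVR) is treated as a machine via the 'digit count' decision above.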

View File

@@ -4,7 +4,7 @@ const assert = require('assert');
const {
AWS_REGION,
AWS_SNS_PORT: PORT,
AWS_SNS_TOPIC_ARM,
AWS_SNS_TOPIC_ARN,
AWS_SNS_PORT_MAX,
} = require('../config');
const {LifeCycleEvents} = require('./constants');
@@ -55,12 +55,12 @@ class SnsNotifier extends Emitter {
async _handlePost(req, res) {
try {
const parsedBody = JSON.parse(req.body);
this.logger.debug({headers: req.headers, body: parsedBody}, 'Received HTTP POST from AWS');
this.logger.info({headers: req.headers, body: parsedBody}, 'Received HTTP POST from AWS');
if (!validatePayload(parsedBody)) {
this.logger.info('incoming AWS SNS HTTP POST failed signature validation');
return res.sendStatus(403);
}
this.logger.debug('incoming HTTP POST passed validation');
this.logger.info('incoming HTTP POST passed validation');
res.sendStatus(200);
switch (parsedBody.Type) {
@@ -74,7 +74,18 @@ class SnsNotifier extends Emitter {
subscriptionRequestId: this.subscriptionRequestId
}, 'response from SNS SubscribeURL');
const data = await this.describeInstance();
this.lifecycleState = data.AutoScalingGroups[0].Instances[0].LifecycleState;
const group = data.AutoScalingGroups.find((group) =>
group.Instances && group.Instances.some((instance) => instance.InstanceId === this.instanceId)
);
if (!group) {
this.logger.error('Current instance not found in any Auto Scaling group', data);
} else {
const instance = group.Instances.find((instance) => instance.InstanceId === this.instanceId);
this.lifecycleState = instance.LifecycleState;
}
//this.lifecycleState = data.AutoScalingGroups[0].Instances[0].LifecycleState;
this.emit('SubscriptionConfirmation', {publicIp: this.publicIp});
break;
@@ -94,7 +105,7 @@ class SnsNotifier extends Emitter {
this.unsubscribe();
}
else {
this.logger.debug(`SnsNotifier - instance ${msg.EC2InstanceId} is scaling in (not us)`);
this.logger.info(`SnsNotifier - instance ${msg.EC2InstanceId} is scaling in (not us)`);
}
}
break;
@@ -111,7 +122,7 @@ class SnsNotifier extends Emitter {
async init() {
try {
this.logger.debug('SnsNotifier: retrieving instance data');
this.logger.info('SnsNotifier: retrieving instance data');
this.instanceId = await getString('http://169.254.169.254/latest/meta-data/instance-id');
this.publicIp = await getString('http://169.254.169.254/latest/meta-data/public-ipv4');
this.logger.info({
@@ -142,13 +153,13 @@ class SnsNotifier extends Emitter {
try {
const params = {
Protocol: 'http',
TopicArn: AWS_SNS_TOPIC_ARM,
TopicArn: AWS_SNS_TOPIC_ARN,
Endpoint: this.snsEndpoint
};
const response = await snsClient.send(new SubscribeCommand(params));
this.logger.info({response}, `response to SNS subscribe to ${AWS_SNS_TOPIC_ARM}`);
this.logger.info({response}, `response to SNS subscribe to ${AWS_SNS_TOPIC_ARN}`);
} catch (err) {
this.logger.error({err}, `Error subscribing to SNS topic arn ${AWS_SNS_TOPIC_ARM}`);
this.logger.error({err}, `Error subscribing to SNS topic arn ${AWS_SNS_TOPIC_ARN}`);
}
}
@@ -159,9 +170,9 @@ class SnsNotifier extends Emitter {
SubscriptionArn: this.subscriptionArn
};
const response = await snsClient.send(new UnsubscribeCommand(params));
this.logger.info({response}, `response to SNS unsubscribe to ${AWS_SNS_TOPIC_ARM}`);
this.logger.info({response}, `response to SNS unsubscribe to ${AWS_SNS_TOPIC_ARN}`);
} catch (err) {
this.logger.error({err}, `Error unsubscribing to SNS topic arn ${AWS_SNS_TOPIC_ARM}`);
this.logger.error({err}, `Error unsubscribing to SNS topic arn ${AWS_SNS_TOPIC_ARN}`);
}
}

View File

@@ -46,6 +46,9 @@ class BackgroundTaskManager extends Emitter {
case 'transcribe':
task = await this._initTranscribe(opts);
break;
case 'ttsStream':
task = await this._initTtsStream(opts);
break;
default:
break;
}
@@ -62,7 +65,7 @@ class BackgroundTaskManager extends Emitter {
this.logger.info(`stopping background task: ${type}`);
task.removeAllListeners();
task.span.end();
task.kill();
task.kill(this.cs);
// Remove task from managed List
this.tasks.delete(type);
}
@@ -100,6 +103,7 @@ class BackgroundTaskManager extends Emitter {
async _initBargeIn(opts) {
let task;
try {
const copy = JSON.parse(JSON.stringify(opts));
const t = normalizeJambones(this.logger, [opts]);
task = makeTask(this.logger, t[0]);
task
@@ -118,7 +122,7 @@ class BackgroundTaskManager extends Emitter {
if (task.sticky && !this.cs.callGone && !this.cs._stopping) {
this.logger.info('BackgroundTaskManager:_initBargeIn: restarting background bargeIn');
this._bargeInHandled = false;
this.newTask('bargeIn', opts, true);
this.newTask('bargeIn', copy, true);
}
return;
})
@@ -173,6 +177,25 @@ class BackgroundTaskManager extends Emitter {
return task;
}
// Initiate Tts Stream
async _initTtsStream(opts) {
let task;
try {
const t = normalizeJambones(this.logger, [opts]);
task = makeTask(this.logger, t[0]);
const resources = await this.cs._evaluatePreconditions(task);
const {span, ctx} = this.rootSpan.startChildSpan(`background-ttsStream:${task.summary}`);
task.span = span;
task.ctx = ctx;
task.exec(this.cs, resources)
.then(this._taskCompleted.bind(this, 'ttsStream', task))
.catch(this._taskError.bind(this, 'ttsStream', task));
} catch (err) {
this.logger.info(err, 'BackgroundTaskManager:_initTtsStream - Error creating ttsStream task');
}
return task;
}
_taskCompleted(type, task) {
this.logger.debug({type, task}, `BackgroundTaskManager:_taskCompleted: task completed, sticky: ${task.sticky}`);
task.removeAllListeners();

View File

@@ -1,6 +1,7 @@
const assert = require('assert');
const Emitter = require('events');
const crypto = require('crypto');
const parseUrl = require('parse-url');
const timeSeries = require('@jambonz/time-series');
const {NODE_ENV, JAMBONES_TIME_SERIES_HOST} = require('../config');
let alerter ;
@@ -21,6 +22,10 @@ class BaseRequestor extends Emitter {
const {stats} = require('../../').srf.locals;
this.stats = stats;
const u = this._parsedUrl = parseUrl(this.url);
if (u.port) this._baseUrl = `${u.protocol}://${u.resource}:${u.port}`;
else this._baseUrl = `${u.protocol}://${u.resource}`;
if (!alerter) {
alerter = timeSeries(logger, {
host: JAMBONES_TIME_SERIES_HOST,
@@ -30,6 +35,10 @@ class BaseRequestor extends Emitter {
}
}
get baseUrl() {
return this._baseUrl;
}
get Alerter() {
return alerter;
}
@@ -70,7 +79,44 @@ class BaseRequestor extends Emitter {
return time.toFixed(0);
}
_parseHashParams(hash) {
// Remove the leading # if present
const hashString = hash.startsWith('#') ? hash.substring(1) : hash;
// Use URLSearchParams for parsing
const params = new URLSearchParams(hashString);
// Convert to a regular object
const result = {};
for (const [key, value] of params.entries()) {
result[key] = value;
}
return result;
}
/**
* Check if the error should be retried based on retry policy
* @param {Error} err - The error that occurred
* @param {string[]} rpValues - Array of retry policy values
* @returns {boolean} True if the error should be retried
*/
_shouldRetry(err, rpValues) {
// ct = connection timeout (ECONNREFUSED, ETIMEDOUT, etc)
const isCt = err.code === 'ECONNREFUSED' ||
err.code === 'ETIMEDOUT' ||
err.code === 'ECONNRESET' ||
err.code === 'ECONNABORTED';
// rt = request timeout
const isRt = err.name === 'TimeoutError';
// 4xx = client errors
const is4xx = err.statusCode >= 400 && err.statusCode < 500;
// 5xx = server errors
const is5xx = err.statusCode >= 500 && err.statusCode < 600;
// Check if error type is included in retry policy
return rpValues.includes('all') ||
(isCt && rpValues.includes('ct')) ||
(isRt && rpValues.includes('rt')) ||
(is4xx && rpValues.includes('4xx')) ||
(is5xx && rpValues.includes('5xx'));
}
}
module.exports = BaseRequestor;
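A minimal sketch of how the classification above plays out, assuming a requestor instance and hand-built error objects:

  requestor._shouldRetry({ code: 'ECONNREFUSED' }, ['ct']);      // true: connection error and 'ct' requested
  requestor._shouldRetry({ statusCode: 503 }, ['5xx']);          // true: server error and '5xx' requested
  requestor._shouldRetry({ statusCode: 404 }, ['ct', '5xx']);    // false: 4xx is not in the policy
  requestor._shouldRetry({ name: 'TimeoutError' }, ['all']);     // true: 'all' retries every error type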

View File

@@ -1,5 +1,6 @@
{
"TaskName": {
"Alert": "alert",
"Answer": "answer",
"Conference": "conference",
"Config": "config",
@@ -14,6 +15,7 @@
"Leave": "leave",
"Lex": "lex",
"Listen": "listen",
"Llm": "llm",
"Message": "message",
"Pause": "pause",
"Play": "play",
@@ -27,10 +29,11 @@
"SipRedirect": "sip:redirect",
"Say": "say",
"SayLegacy": "say:legacy",
"Stream": "stream",
"Tag": "tag",
"Transcribe": "transcribe"
},
"AllowedSipRecVerbs": ["answer", "config", "gather", "transcribe", "listen", "tag"],
"AllowedSipRecVerbs": ["answer", "config", "gather", "transcribe", "listen", "tag", "hangup", "sip:decline"],
"AllowedConfirmSessionVerbs": ["config", "gather", "plays", "say", "tag"],
"CallStatus": {
"Trying": "trying",
@@ -91,7 +94,20 @@
"DeepgramTranscriptionEvents": {
"Transcription": "deepgram_transcribe::transcription",
"ConnectFailure": "deepgram_transcribe::connect_failed",
"Connect": "deepgram_transcribe::connect"
"Connect": "deepgram_transcribe::connect",
"Error": "deepgram_transcribe::error"
},
"DeepgramfluxTranscriptionEvents": {
"Transcription": "deepgramflux_transcribe::transcription",
"ConnectFailure": "deepgramflux_transcribe::connect_failed",
"Connect": "deepgramflux_transcribe::connect",
"Error": "deepgramflux_transcribe::error"
},
"GladiaTranscriptionEvents": {
"Transcription": "gladia_transcribe::transcription",
"ConnectFailure": "gladia_transcribe::connect_failed",
"Connect": "gladia_transcribe::connect",
"Error": "gladia_transcribe::error"
},
"SonioxTranscriptionEvents": {
"Transcription": "soniox_transcribe::transcription",
@@ -126,6 +142,27 @@
"NoSpeechDetected": "azure_transcribe::no_speech_detected",
"VadDetected": "azure_transcribe::vad_detected"
},
"SpeechmaticsTranscriptionEvents": {
"Transcription": "speechmatics_transcribe::transcription",
"Translation": "speechmatics_transcribe::translation",
"Info": "speechmatics_transcribe::info",
"RecognitionStarted": "speechmatics_transcribe::recognition_started",
"ConnectFailure": "speechmatics_transcribe::connect_failed",
"Connect": "speechmatics_transcribe::connect",
"Error": "speechmatics_transcribe::error"
},
"OpenAITranscriptionEvents": {
"Transcription": "openai_transcribe::transcription",
"Translation": "openai_transcribe::translation",
"SpeechStarted": "openai_transcribe::speech_started",
"SpeechStopped": "openai_transcribe::speech_stopped",
"PartialTranscript": "openai_transcribe::partial_transcript",
"Info": "openai_transcribe::info",
"RecognitionStarted": "openai_transcribe::recognition_started",
"ConnectFailure": "openai_transcribe::connect_failed",
"Connect": "openai_transcribe::connect",
"Error": "openai_transcribe::error"
},
"JambonzTranscriptionEvents": {
"Transcription": "jambonz_transcribe::transcription",
"ConnectFailure": "jambonz_transcribe::connect_failed",
@@ -138,9 +175,30 @@
"ConnectFailure": "assemblyai_transcribe::connect_failed",
"Connect": "assemblyai_transcribe::connect"
},
"HoundifyTranscriptionEvents": {
"Transcription": "houndify_transcribe::transcription",
"Error": "houndify_transcribe::error",
"ConnectFailure": "houndify_transcribe::connect_failed",
"Connect": "houndify_transcribe::connect"
},
"VoxistTranscriptionEvents": {
"Transcription": "voxist_transcribe::transcription",
"Error": "voxist_transcribe::error",
"ConnectFailure": "voxist_transcribe::connect_failed",
"Connect": "voxist_transcribe::connect"
},
"CartesiaTranscriptionEvents": {
"Transcription": "cartesia_transcribe::transcription",
"Error": "cartesia_transcribe::error",
"ConnectFailure": "cartesia_transcribe::connect_failed",
"Connect": "cartesia_transcribe::connect"
},
"VadDetection": {
"Detection": "vad_detect:detection"
},
"SileroVadDetection": {
"Detection": "vad_silero:detect"
},
"ListenEvents": {
"Connect": "mod_audio_fork::connect",
"ConnectFailure": "mod_audio_fork::connect_failed",
@@ -158,6 +216,41 @@
"StandbyEnter": "standby-enter",
"StandbyExit": "standby-exit"
},
"LlmEvents_OpenAI": {
"Error": "error",
"Connect": "openai_s2s::connect",
"ConnectFailure": "openai_s2s::connect_failed",
"Disconnect": "openai_s2s::disconnect",
"ServerEvent": "openai_s2s::server_event"
},
"LlmEvents_Google": {
"Error": "error",
"Connect": "google_s2s::connect",
"ConnectFailure": "google_s2s::connect_failed",
"Disconnect": "google_s2s::disconnect",
"ServerEvent": "google_s2s::server_event"
},
"LlmEvents_Elevenlabs": {
"Error": "error",
"Connect": "elevenlabs_s2s::connect",
"ConnectFailure": "elevenlabs_s2s::connect_failed",
"Disconnect": "elevenlabs_s2s::disconnect",
"ServerEvent": "elevenlabs_s2s::server_event"
},
"LlmEvents_VoiceAgent": {
"Error": "error",
"Connect": "voice_agent_s2s::connect",
"ConnectFailure": "voice_agent_s2s::connect_failed",
"Disconnect": "voice_agent_s2s::disconnect",
"ServerEvent": "voice_agent_s2s::server_event"
},
"LlmEvents_Ultravox": {
"Error": "error",
"Connect": "ultravox_s2s::connect",
"ConnectFailure": "ultravox_s2s::connect_failed",
"Disconnect": "ultravox_s2s::disconnect",
"ServerEvent": "ultravox_s2s::server_event"
},
"QueueResults": {
"Bridged": "bridged",
"Error": "error",
@@ -172,7 +265,9 @@
},
"KillReason": {
"Hangup": "hangup",
"Replaced": "replaced"
"Replaced": "replaced",
"ReferComplete": "refer-complete",
"MediaTimeout": "media_timeout"
},
"HookMsgTypes": [
"session:new",
@@ -184,6 +279,10 @@
"dial:confirm",
"verb:hook",
"verb:status",
"llm:event",
"llm:tool-call",
"tts:tokens-result",
"tts:streaming-event",
"jambonz:error"
],
"RecordState": {
@@ -202,7 +301,63 @@
"ToneTimeout": "amd_tone_timeout",
"Stopped": "amd_stopped"
},
"MediaPath": {
"NoMedia": "no-media",
"PartialMedia": "partial-media",
"FullMedia": "full-media"
},
"DeepgramTtsStreamingEvents": {
"Empty": "deepgram_tts_streaming::empty",
"ConnectFailure": "deepgram_tts_streaming::connect_failed",
"Connect": "deepgram_tts_streaming::connect"
},
"CartesiaTtsStreamingEvents": {
"Empty": "cartesia_tts_streaming::empty",
"ConnectFailure": "cartesia_tts_streaming::connect_failed",
"Connect": "cartesia_tts_streaming::connect"
},
"ElevenlabsTtsStreamingEvents": {
"Empty": "elevenlabs_tts_streaming::empty",
"ConnectFailure": "elevenlabs_tts_streaming::connect_failed",
"Connect": "elevenlabs_tts_streaming::connect"
},
"RimelabsTtsStreamingEvents": {
"Empty": "rimelabs_tts_streaming::empty",
"ConnectFailure": "rimelabs_tts_streaming::connect_failed",
"Connect": "rimelabs_tts_streaming::connect"
},
"CustomTtsStreamingEvents": {
"Empty": "custom_tts_streaming::empty",
"ConnectFailure": "custom_tts_streaming::connect_failed",
"Connect": "custom_tts_streaming::connect"
},
"TtsStreamingEvents": {
"Empty": "tts_streaming::empty",
"Pause": "tts_streaming::pause",
"Resume": "tts_streaming::resume",
"ConnectFailure": "tts_streaming::connect_failed",
"Connected": "tts_streaming::connected"
},
"TtsStreamingConnectionStatus": {
"NotConnected": "not_connected",
"Connected": "connected",
"Connecting": "connecting",
"Failed": "failed"
},
"MAX_SIMRINGS": 10,
"BONG_TONE": "tone_stream://v=-7;%(100,0,941.0,1477.0);v=-7;>=2;+=.1;%(1400,0,350,440)",
"FS_UUID_SET_NAME": "fsUUIDs"
"FS_UUID_SET_NAME": "fsUUIDs",
"SystemState" : {
"Online": "ONLINE",
"Offline": "OFFLINE",
"GracefulShutdownInProgress":"SHUTDOWN_IN_PROGRESS"
},
"FEATURE_SERVER" : "feature-server",
"WS_CLOSE_CODES": {
"NormalClosure": 1000,
"GoingAway": 1001
},
"NON_FANTAL_ERRORS": [
"File Not Found"
]
}

View File

@@ -76,9 +76,20 @@ const speechMapper = (cred) => {
else if ('deepgram' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.deepgram_stt_uri = o.deepgram_stt_uri;
obj.deepgram_tts_uri = o.deepgram_tts_uri;
obj.deepgram_stt_use_tls = o.deepgram_stt_use_tls;
}
else if ('gladia' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.region = o.region;
}
else if ('deepgramflux' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
}
else if ('soniox' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
@@ -90,39 +101,89 @@ const speechMapper = (cred) => {
else if ('cobalt' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.cobalt_server_uri = o.cobalt_server_uri;
} else if ('elevenlabs' === obj.vendor) {
}
else if ('elevenlabs' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.api_uri = o.api_uri;
obj.options = o.options;
} else if ('playht' === obj.vendor) {
}
else if ('playht' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.user_id = o.user_id;
obj.voice_engine = o.voice_engine;
obj.options = o.options;
} else if ('rimelabs' === obj.vendor) {
}
else if ('cartesia' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.stt_model_id = o.stt_model_id;
obj.embedding = o.embedding;
obj.options = o.options;
}
else if ('rimelabs' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
obj.options = o.options;
} else if ('assemblyai' === obj.vendor) {
}
else if ('resemble' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
} else if ('whisper' === obj.vendor) {
obj.resemble_tts_use_tls = o.resemble_tts_use_tls;
obj.resemble_tts_uri = o.resemble_tts_uri;
}
else if ('inworld' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
} else if ('verbio' === obj.vendor) {
obj.options = o.options;
}
else if ('assemblyai' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.service_version = o.service_version;
}
else if ('houndify' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.client_id = o.client_id;
obj.client_key = o.client_key;
obj.user_id = o.user_id;
}
else if ('voxist' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
}
else if ('whisper' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
}
else if ('verbio' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.client_id = o.client_id;
obj.client_secret = o.client_secret;
obj.engine_version = o.engine_version;
} else if (obj.vendor.startsWith('custom:')) {
}
else if ('speechmatics' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.speechmatics_stt_uri = o.speechmatics_stt_uri;
}
else if ('openai' === obj.vendor) {
const o = JSON.parse(decrypt(credential));
obj.api_key = o.api_key;
obj.model_id = o.model_id;
}
else if (obj.vendor.startsWith('custom:')) {
const o = JSON.parse(decrypt(credential));
obj.auth_token = o.auth_token;
obj.custom_stt_url = o.custom_stt_url;
obj.custom_tts_url = o.custom_tts_url;
obj.custom_tts_streaming_url = o.custom_tts_streaming_url;
}
} catch (err) {
console.log(err);
@@ -202,11 +263,23 @@ module.exports = (logger, srf) => {
}
};
const lookupVoipCarrierBySid = async(sid) => {
const pp = pool.promise();
try {
const [r] = await pp.query('SELECT * FROM voip_carriers WHERE voip_carrier_sid = ?', [sid]);
return r;
} catch (err) {
logger.error({err}, `lookupVoipCarrierBySid: Error ${sid}`);
}
};
return {
lookupAccountDetails,
updateSpeechCredentialLastUsed,
lookupCarrier,
lookupCarrierByPhoneNumber,
lookupGoogleCustomVoice
lookupGoogleCustomVoice,
lookupVoipCarrierBySid
};
};

33
lib/utils/error.js Normal file
View File

@@ -0,0 +1,33 @@
class NonFatalTaskError extends Error {
constructor(msg) {
super(msg);
}
}
class SpeechCredentialError extends NonFatalTaskError {
constructor(msg) {
super(msg);
}
}
class PlayFileNotFoundError extends NonFatalTaskError {
constructor(url) {
super('File not found');
this.url = url;
}
}
class HTTPResponseError extends Error {
constructor(statusCode) {
super('Unexpected HTTP Response');
delete this.stack;
this.statusCode = statusCode;
}
}
module.exports = {
SpeechCredentialError,
NonFatalTaskError,
PlayFileNotFoundError,
HTTPResponseError
};
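A hedged sketch of how these classes might be consumed by calling code (the handling shown is illustrative, not the call-session's actual logic; logger is assumed to be in scope):

  const { NonFatalTaskError, SpeechCredentialError } = require('./error');

  try {
    throw new SpeechCredentialError('no provisioned speech credentials for TTS');
  } catch (err) {
    if (err instanceof NonFatalTaskError) {
      // SpeechCredentialError extends NonFatalTaskError, so this branch is taken
      logger.info({ err }, 'non-fatal task error, continuing with the application');
    } else {
      throw err;
    }
  }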

5
lib/utils/helpers.js Normal file
View File

@@ -0,0 +1,5 @@
const sleepFor = (ms) => new Promise((resolve) => setTimeout(() => resolve(), ms));
module.exports = {
sleepFor
};

View File

@@ -16,6 +16,7 @@ const {
NODE_ENV,
HTTP_USER_AGENT_HEADER,
} = require('../config');
const {HTTPResponseError} = require('./error');
const toBase64 = (str) => Buffer.from(str || '', 'utf8').toString('base64');
@@ -40,15 +41,14 @@ class HttpRequestor extends BaseRequestor {
constructor(logger, account_sid, hook, secret) {
super(logger, account_sid, hook, secret);
this.method = hook.method || 'POST';
this.method = hook.method?.toUpperCase() || 'POST';
this.authHeader = basicAuth(hook.username, hook.password);
this.backoffMs = 500;
assert(this._isAbsoluteUrl(this.url));
assert(['GET', 'POST'].includes(this.method));
const u = this._parsedUrl = parseUrl(this.url);
if (u.port) this._baseUrl = `${u.protocol}://${u.resource}:${u.port}`;
else this._baseUrl = `${u.protocol}://${u.resource}`;
this._protocol = u.protocol;
this._resource = u.resource;
this._port = u.port;
@@ -56,18 +56,18 @@ class HttpRequestor extends BaseRequestor {
this._usePools = HTTP_POOL && parseInt(HTTP_POOL);
if (this._usePools) {
if (pools.has(this._baseUrl)) {
this.client = pools.get(this._baseUrl);
if (pools.has(this.baseUrl)) {
this.client = pools.get(this.baseUrl);
}
else {
const connections = HTTP_POOLSIZE ? parseInt(HTTP_POOLSIZE) : 10;
const pipelining = HTTP_PIPELINING ? parseInt(HTTP_PIPELINING) : 1;
const pool = this.client = new Pool(this._baseUrl, {
const pool = this.client = new Pool(this.baseUrl, {
connections,
pipelining
});
pools.set(this._baseUrl, pool);
this.logger.debug(`HttpRequestor:created pool for ${this._baseUrl}`);
pools.set(this.baseUrl, pool);
this.logger.debug(`HttpRequestor:created pool for ${this.baseUrl}`);
}
}
else {
@@ -88,10 +88,6 @@ class HttpRequestor extends BaseRequestor {
}
}
get baseUrl() {
return this._baseUrl;
}
close() {
if (!this._usePools && !this.client?.closed) this.client.close();
}
@@ -107,15 +103,15 @@ class HttpRequestor extends BaseRequestor {
* @param {string} [hook.password] - if basic auth is protecting the endpoint
* @param {object} [params] - request parameters
*/
async request(type, hook, params, httpHeaders = {}) {
async request(type, hook, params, httpHeaders = {}, span) {
/* jambonz:error only sent over ws */
if (type === 'jambonz:error') return;
assert(HookMsgTypes.includes(type));
const payload = params ? snakeCaseKeys(params, ['customerData', 'sip']) : null;
const payload = params ? snakeCaseKeys(params, ['customerData', 'sip', 'env_vars', 'args']) : null;
const url = hook.url || hook;
const method = hook.method || 'POST';
const method = hook.method?.toUpperCase() || 'POST';
let buf = '';
httpHeaders = {
...httpHeaders,
@@ -123,7 +119,7 @@ class HttpRequestor extends BaseRequestor {
};
assert.ok(url, 'HttpRequestor:request url was not provided');
assert.ok, (['GET', 'POST'].includes(method), `HttpRequestor:request method must be 'GET' or 'POST' not ${method}`);
assert.ok(['GET', 'POST'].includes(method), `HttpRequestor:request method must be 'GET' or 'POST' not ${method}`);
const startAt = process.hrtime();
/* if we have an absolute url, and it is ws then do a websocket connection */
@@ -136,30 +132,51 @@ class HttpRequestor extends BaseRequestor {
this.close();
this.emit('handover', requestor);
}
return requestor.request('session:new', hook, params, httpHeaders);
return requestor.request('session:new', hook, params, httpHeaders, span);
}
let newClient;
try {
this.backoffMs = 500;
// Parse URL and extract hash parameters for retry configuration
// Prepare request options - only do this once
const absUrl = this._isRelativeUrl(url) ? `${this.baseUrl}${url}` : url;
const parsedUrl = parseUrl(absUrl);
const hash = parsedUrl.hash || '';
const hashObj = hash ? this._parseHashParams(hash) : {};
// Retry policy: rp valid values: 4xx, 5xx, ct, rt, all, default is ct
// Retry count: rc valid values: 1-5, default is 0
// rc is the number of attempts we'll make AFTER the initial try
const rc = hash ? Math.min(Math.abs(parseInt(hashObj.rc || '0')), 5) : 0;
const rp = hashObj.rp || 'ct';
const rpValues = rp.split(',').map((v) => v.trim());
let retryCount = 0;
// Set up client, path and query parameters - only do this once
let client, path, query;
if (this._isRelativeUrl(url)) {
client = this.client;
path = url;
}
else {
const u = parseUrl(url);
if (u.resource === this._resource && u.port === this._port && u.protocol === this._protocol) {
if (parsedUrl.resource === this._resource &&
parsedUrl.port === this._port &&
parsedUrl.protocol === this._protocol) {
client = this.client;
path = u.pathname;
query = u.query;
path = parsedUrl.pathname;
query = parsedUrl.query;
}
else {
if (u.port) client = newClient = new Client(`${u.protocol}://${u.resource}:${u.port}`);
else client = newClient = new Client(`${u.protocol}://${u.resource}`);
path = u.pathname;
query = u.query;
if (parsedUrl.port) {
client = newClient = new Client(`${parsedUrl.protocol}://${parsedUrl.resource}:${parsedUrl.port}`);
}
else client = newClient = new Client(`${parsedUrl.protocol}://${parsedUrl.resource}`);
path = parsedUrl.pathname;
query = parsedUrl.query;
}
}
const sigHeader = this._generateSigHeader(payload, this.secret);
const hdrs = {
...sigHeader,
@@ -167,20 +184,8 @@ class HttpRequestor extends BaseRequestor {
...httpHeaders,
...('POST' === method && {'Content-Type': 'application/json'})
};
const absUrl = this._isRelativeUrl(url) ? `${this.baseUrl}${url}` : url;
this.logger.debug({url, absUrl, hdrs}, 'send webhook');
const {statusCode, headers, body} = HTTP_PROXY_IP ? await request(
this.baseUrl,
{
path,
query,
method,
headers: hdrs,
...('POST' === method && {body: JSON.stringify(payload)}),
timeout: HTTP_TIMEOUT,
followRedirects: false
}
) : await client.request({
const requestOptions = {
path,
query,
method,
@@ -188,15 +193,51 @@ class HttpRequestor extends BaseRequestor {
...('POST' === method && {body: JSON.stringify(payload)}),
timeout: HTTP_TIMEOUT,
followRedirects: false
});
if (![200, 202, 204].includes(statusCode)) {
const err = new Error();
err.statusCode = statusCode;
throw err;
}
if (headers['content-type']?.includes('application/json')) {
buf = await body.json();
};
// Simplified makeRequest function that just executes the HTTP request
const makeRequest = async() => {
this.logger.debug({url, absUrl, hdrs, retryCount},
`send webhook${retryCount > 0 ? ' (retry ' + retryCount + ')' : ''}`);
const {statusCode, headers, body} = HTTP_PROXY_IP ? await request(
this.baseUrl,
requestOptions
) : await client.request(requestOptions);
if (![200, 202, 204].includes(statusCode)) {
const err = new HTTPResponseError(statusCode);
throw err;
}
if (headers['content-type']?.includes('application/json')) {
return await body.json();
}
return '';
};
while (true) {
try {
buf = await makeRequest();
break; // Success, exit the retry loop
} catch (err) {
retryCount++;
// Check if we should retry
if (retryCount <= rc && this._shouldRetry(err, rpValues)) {
this.logger.info(
{err, baseUrl: this.baseUrl, url, retryCount, maxRetries: rc},
`Retrying request (${retryCount}/${rc})`
);
const delay = this.backoffMs;
this.backoffMs = this.backoffMs < 2000 ? this.backoffMs * 2 : (this.backoffMs + 2000);
await new Promise((resolve) => setTimeout(resolve, delay));
continue;
}
throw err;
}
}
if (newClient) newClient.close();
} catch (err) {
if (err.statusCode) {
@@ -225,10 +266,10 @@ class HttpRequestor extends BaseRequestor {
const rtt = this._roundTrip(startAt);
if (buf) this.stats.histogram('app.hook.response_time', rtt, ['hook_type:app']);
if (buf && Array.isArray(buf)) {
if (buf && (Array.isArray(buf) || type == 'llm:tool-call')) {
this.logger.info({response: buf}, `HttpRequestor:request ${method} ${url} succeeded in ${rtt}ms`);
return buf;
}
return buf;
}
}
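Putting the pieces together: a hook URL such as https://example.com/status#rc=3&rp=ct,5xx (hypothetical) parses to rc=3 and rpValues of ['ct', '5xx'], so connection errors and 5xx responses are retried up to three additional times after the initial attempt, with backoff delays of roughly 500ms, 1000ms and 2000ms between attempts; a hook with no fragment keeps the previous behavior of a single attempt.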

View File

@@ -1,5 +1,5 @@
const Mrf = require('drachtio-fsmrf');
const ip = require('ip');
const os = require('os');
const {
JAMBONES_MYSQL_HOST,
JAMBONES_MYSQL_USER,
@@ -12,23 +12,45 @@ const {
JAMBONES_TIME_SERIES_HOST,
JAMBONES_ESL_LISTEN_ADDRESS,
PORT,
HTTP_IP,
NODE_ENV,
} = require('../config');
const Registrar = require('@jambonz/mw-registrar');
const assert = require('assert');
function initMS(logger, wrapper, ms) {
function getLocalIp() {
const interfaces = os.networkInterfaces();
for (const interfaceName in interfaces) {
const interface = interfaces[interfaceName];
for (const iface of interface) {
if (iface.family === 'IPv4' && !iface.internal) {
return iface.address;
}
}
}
return '127.0.0.1'; // Fallback to localhost if no suitable interface found
}
function initMS(logger, wrapper, ms, {
onFreeswitchConnect,
onFreeswitchDisconnect
}) {
Object.assign(wrapper, {ms, active: true, connects: 1});
logger.info(`connected to freeswitch at ${ms.address}`);
onFreeswitchConnect(wrapper);
ms.conn
.on('esl::end', () => {
wrapper.active = false;
wrapper.connects = 0;
logger.info(`lost connection to freeswitch at ${ms.address}`);
onFreeswitchDisconnect(wrapper);
ms.removeAllListeners();
})
.on('esl::ready', () => {
if (wrapper.connects > 0) {
logger.info(`connected to freeswitch at ${ms.address}`);
logger.info(`esl::ready connected to freeswitch at ${ms.address}`);
}
wrapper.connects = 1;
wrapper.active = true;
@@ -42,7 +64,10 @@ function initMS(logger, wrapper, ms) {
});
}
function installSrfLocals(srf, logger) {
function installSrfLocals(srf, logger, {
onFreeswitchConnect = () => {},
onFreeswitchDisconnect = () => {}
}) {
logger.debug('installing srf locals');
assert(!srf.locals.dbHelpers);
const {tracer} = srf.locals.otel;
@@ -77,7 +102,10 @@ function installSrfLocals(srf, logger) {
mediaservers.push(val);
try {
const ms = await mrf.connect(fs);
initMS(logger, val, ms);
initMS(logger, val, ms, {
onFreeswitchConnect,
onFreeswitchDisconnect
});
}
catch (err) {
logger.info({err}, `failed connecting to freeswitch at ${fs.address}, will retry shortly: ${err.message}`);
@@ -88,9 +116,15 @@ function installSrfLocals(srf, logger) {
for (const val of mediaservers) {
if (val.connects === 0) {
try {
// make sure all listeners are removed before reconnecting
val.ms?.disconnect();
val.ms = null;
logger.info({mediaserver: val.opts}, 'Retrying initial connection to media server');
const ms = await mrf.connect(val.opts);
initMS(logger, val, ms);
initMS(logger, val, ms, {
onFreeswitchConnect,
onFreeswitchDisconnect
});
} catch (err) {
logger.info({err}, `failed connecting to freeswitch at ${val.opts.address}, will retry shortly`);
}
@@ -138,7 +172,9 @@ function installSrfLocals(srf, logger) {
lookupAccountBySid,
lookupAccountCapacitiesBySid,
lookupSmppGateways,
lookupClientByAccountAndUsername
lookupClientByAccountAndUsername,
lookupSystemInformation,
lookupLcrByAccount
} = require('@jambonz/db-helpers')({
host: JAMBONES_MYSQL_HOST,
user: JAMBONES_MYSQL_USER,
@@ -184,7 +220,8 @@ function installSrfLocals(srf, logger) {
} = require('@jambonz/speech-utils')({}, logger);
const {
writeAlerts,
AlertType
AlertType,
writeSystemAlerts
} = require('@jambonz/time-series')(logger, {
host: JAMBONES_TIME_SERIES_HOST,
commitSize: 50,
@@ -193,7 +230,8 @@ function installSrfLocals(srf, logger) {
let localIp;
try {
localIp = ip.address();
// Either use the configured IP address or discover it
localIp = HTTP_IP || getLocalIp();
} catch (err) {
logger.error({err}, 'installSrfLocals - error detecting local ipv4 address');
}
@@ -213,6 +251,7 @@ function installSrfLocals(srf, logger) {
lookupAccountCapacitiesBySid,
lookupSmppGateways,
lookupClientByAccountAndUsername,
lookupSystemInformation,
updateCallStatus,
retrieveCall,
listCalls,
@@ -241,7 +280,8 @@ function installSrfLocals(srf, logger) {
retrieveByPatternSortedSet,
sortedSetLength,
sortedSetPositionByPattern,
getVerbioAccessToken
getVerbioAccessToken,
lookupLcrByAccount
},
parentLogger: logger,
getSBC,
@@ -252,7 +292,8 @@ function installSrfLocals(srf, logger) {
getFreeswitch,
stats: stats,
writeAlerts,
AlertType
AlertType,
writeSystemAlerts
};
if (localIp) {

103
lib/utils/llm-mcp.js Normal file
View File

@@ -0,0 +1,103 @@
const { Client } = require('@modelcontextprotocol/sdk/client/index.js');
class LlmMcpService {
constructor(logger, mcpServers) {
this.logger = logger;
this.mcpServers = mcpServers || [];
this.mcpClients = [];
}
// make sure we call init() before using any of the mcp clients
// this is to ensure that we have a valid connection to the MCP server
// and that we have collected the available tools.
async init() {
if (this.mcpClients.length > 0) {
return;
}
const { SSEClientTransport } = await import('@modelcontextprotocol/sdk/client/sse.js');
for (const server of this.mcpServers) {
const { url } = server;
if (url) {
try {
const transport = new SSEClientTransport(new URL(url), {});
const client = new Client({ name: 'Jambonz MCP Client', version: '1.0.0' });
await client.connect(transport);
// collect available tools
const { tools } = await client.listTools();
this.mcpClients.push({
url,
client,
tools
});
} catch (err) {
this.logger.error(`LlmMcpService: Failed to connect to MCP server at ${url}: ${err.message}`);
}
}
}
}
async getAvailableMcpTools() {
// returns a list of available tools from all MCP clients
const tools = [];
for (const mcpClient of this.mcpClients) {
const {tools: availableTools} = mcpClient;
if (availableTools) {
tools.push(...availableTools);
}
}
return tools;
}
async getMcpClientByToolName(name) {
for (const mcpClient of this.mcpClients) {
const { tools } = mcpClient;
if (tools && tools.some((tool) => tool.name === name)) {
return mcpClient.client;
}
}
return null;
}
async getMcpClientByToolId(id) {
for (const mcpClient of this.mcpClients) {
const { tools } = mcpClient;
if (tools && tools.some((tool) => tool.id === id)) {
return mcpClient.client;
}
}
return null;
}
async callMcpTool(name, input) {
const client = await this.getMcpClientByToolName(name);
if (client) {
try {
const result = await client.callTool({
name,
arguments: input,
});
this.logger.debug({result}, 'LlmMcpService - result');
return result;
} catch (err) {
this.logger.error({err}, 'LlmMcpService - error calling tool');
throw err;
}
}
}
async close() {
for (const mcpClient of this.mcpClients) {
const { client } = mcpClient;
if (client) {
await client.close();
this.logger.debug({url: mcpClient.url}, 'LlmMcpService - mcp client closed');
}
}
this.mcpClients = [];
}
}
module.exports = LlmMcpService;
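A minimal usage sketch, assuming an SSE-capable MCP server at a hypothetical URL and a tool name it happens to expose; logger is assumed to be in scope and the calls run inside an async function:

  const LlmMcpService = require('./llm-mcp');

  const mcpService = new LlmMcpService(logger, [{ url: 'https://mcp.example.com/sse' }]);
  await mcpService.init();                                   // connect and collect tools
  const tools = await mcpService.getAvailableMcpTools();     // aggregated tool list from all servers
  const result = await mcpService.callMcpTool('get_weather', { city: 'Chicago' });
  await mcpService.close();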

115
lib/utils/media-endpoint.js Normal file
View File

@@ -0,0 +1,115 @@
const {
JAMBONES_USE_FREESWITCH_TIMER_FD,
JAMBONES_MEDIA_TIMEOUT_MS,
JAMBONES_MEDIA_HOLD_TIMEOUT_MS,
JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS,
} = require('../config');
const { sleepFor } = require('./helpers');
const createMediaEndpoint = async(srf, logger, {
activeMs,
drachtioFsmrfOptions = {},
onHoldMusic,
inbandDtmfEnabled,
mediaTimeoutHandler,
} = {}) => {
const { getFreeswitch } = srf.locals;
const ms = activeMs || getFreeswitch();
if (!ms)
throw new Error('no available Freeswitch for creating media endpoint');
const ep = await ms.createEndpoint(drachtioFsmrfOptions);
// Configure the endpoint
const opts = {
...(onHoldMusic && {holdMusic: `shout://${onHoldMusic.replace(/^https?:\/\//, '')}`}),
...(JAMBONES_USE_FREESWITCH_TIMER_FD && {timer_name: 'timerfd'}),
...(JAMBONES_MEDIA_TIMEOUT_MS && {media_timeout: JAMBONES_MEDIA_TIMEOUT_MS}),
...(JAMBONES_MEDIA_HOLD_TIMEOUT_MS && {media_hold_timeout: JAMBONES_MEDIA_HOLD_TIMEOUT_MS})
};
if (Object.keys(opts).length > 0) {
ep.set(opts);
}
// inbandDtmfEnabled
if (inbandDtmfEnabled) {
// https://developer.signalwire.com/freeswitch/FreeSWITCH-Explained/Modules/mod-dptools/6587132/#0-about
ep.execute('start_dtmf').catch((err) => {
logger.error('Error starting inband DTMF', { error: err });
});
ep.inbandDtmfEnabled = true;
}
// Handle Media Timeout
if (mediaTimeoutHandler) {
ep.once('destroy', (evt) => {
mediaTimeoutHandler(evt, ep);
});
}
// Handle graceful shutdown for endpoint if required
if (JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS > 0) {
const getEpGracefulShutdownPromise = () => {
if (!ep.gracefulShutdownPromise) {
ep.gracefulShutdownPromise = new Promise((resolve) => {
// this resolver will be called when the stt task receives a transcription.
ep.gracefulShutdownResolver = () => {
resolve();
ep.gracefulShutdownPromise = null;
};
});
}
return ep.gracefulShutdownPromise;
};
const gracefulShutdownHandler = async() => {
// resolve when one of the following happens:
// 1. the stt task receives a transcription
// 2. JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS has elapsed
await Promise.race([
getEpGracefulShutdownPromise(),
sleepFor(JAMBONES_TRANSCRIBE_EP_DESTROY_DELAY_MS)
]);
};
const origStartTranscription = ep.startTranscription.bind(ep);
ep.startTranscription = async(...args) => {
try {
const result = await origStartTranscription(...args);
ep.isTranscribeActive = true;
return result;
} catch (err) {
ep.isTranscribeActive = false;
throw err;
}
};
const origStopTranscription = ep.stopTranscription.bind(ep);
ep.stopTranscription = async(opts = {}, ...args) => {
const { gracefulShutdown = true, ...others } = opts;
if (ep.isTranscribeActive && gracefulShutdown) {
// only wait for graceful shutdown if transcription is active
await gracefulShutdownHandler();
}
try {
const result = await origStopTranscription({...others}, ...args);
ep.isTranscribeActive = false;
return result;
} catch (err) {
ep.isTranscribeActive = false;
throw err;
}
};
const origDestroy = ep.destroy.bind(ep);
ep.destroy = async() => {
if (ep.isTranscribeActive) {
await gracefulShutdownHandler();
}
return await origDestroy();
};
}
return ep;
};
module.exports = {
createMediaEndpoint,
};
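
A minimal usage sketch of createMediaEndpoint (not part of the diff; the option names come from the destructuring above, while the surrounding async context, srf, logger, and the music-on-hold URL are assumptions for illustration):

// illustrative only - assumes srf.locals.getFreeswitch is wired up as shown above
const { createMediaEndpoint } = require('./lib/utils/media-endpoint');

async function allocateEndpoint(srf, logger) {
  const ep = await createMediaEndpoint(srf, logger, {
    onHoldMusic: 'https://example.com/moh.mp3',   // rewritten internally to a shout:// url
    inbandDtmfEnabled: true,                      // runs start_dtmf on the endpoint
    mediaTimeoutHandler: (evt, ep) => logger.info({evt}, 'media timeout')
  });
  return ep;
}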

32
lib/utils/network.js Normal file
View File

@@ -0,0 +1,32 @@
/**
* Parses a list of hostport entries and selects the first one that matches the specified protocol,
* excluding any entries with the localhost IP address ('127.0.0.1').
*
* Each hostport entry should be in the format: 'protocol/ip:port'
*
* @param {Object} logger - A logging object with a 'debug' method for logging debug messages.
* @param {string} hostport - A comma-separated string containing hostport entries.
* @param {string} protocol - The protocol to match (e.g., 'udp', 'tcp').
* @returns {Array} An array containing:
* 0: protocol
* 1: ip address
* 2: port
*/
const selectHostPort = (logger, hostport, protocol) => {
logger.debug(`selectHostPort: ${hostport}, ${protocol}`);
const sel = hostport
.split(',')
.map((hp) => {
const arr = /(.*)\/(.*):(.*)/.exec(hp);
return [arr[1], arr[2], arr[3]];
})
.filter((hp) => {
return hp[0] === protocol && hp[1] !== '127.0.0.1';
});
return sel[0];
};
module.exports = {
selectHostPort
};
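
A quick usage sketch of selectHostPort (illustrative; the hostport string follows the format described in the JSDoc above, and the addresses are made up):

const { selectHostPort } = require('./lib/utils/network');

// logger can be any object exposing a debug() method
const hostport = 'udp/127.0.0.1:5060,udp/10.0.0.5:5060,tcp/10.0.0.5:5080';
const [proto, ip, port] = selectHostPort(logger, hostport, 'udp');
// proto === 'udp', ip === '10.0.0.5', port === '5060' (the 127.0.0.1 entry is skipped)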

View File

@@ -1,5 +1,5 @@
const Emitter = require('events');
const {CallStatus} = require('./constants');
const {CallStatus, MediaPath} = require('./constants');
const SipError = require('drachtio-srf').SipError;
const {TaskPreconditions, CallDirection} = require('../utils/constants');
const CallInfo = require('../session/call-info');
@@ -12,17 +12,15 @@ const deepcopy = require('deepcopy');
const moment = require('moment');
const stripCodecs = require('./strip-ancillary-codecs');
const RootSpan = require('./call-tracer');
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const HttpRequestor = require('./http-requestor');
const WsRequestor = require('./ws-requestor');
const {makeOpusFirst} = require('./sdp-utils');
const {
JAMBONES_USE_FREESWITCH_TIMER_FD
} = require('../config');
const {makeOpusFirst, removeVideoSdp} = require('./sdp-utils');
const { createMediaEndpoint } = require('./media-endpoint');
class SingleDialer extends Emitter {
constructor({logger, sbcAddress, target, opts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask,
onHoldMusic}) {
onHoldMusic, tmpFiles}) {
super();
assert(target.type);
@@ -43,9 +41,10 @@ class SingleDialer extends Emitter {
this.callGone = false;
this.callSid = uuidv4();
this.callSid = crypto.randomUUID();
this.dialTask = dialTask;
this.onHoldMusic = onHoldMusic;
this.tmpFiles = tmpFiles;
this.on('callStatusChange', this._notifyCallStatusChange.bind(this));
}
@@ -94,6 +93,7 @@ class SingleDialer extends Emitter {
};
}
this.ms = ms;
this.srf = srf;
let uri, to, inviteSpan;
try {
switch (this.target.type) {
@@ -135,8 +135,7 @@ class SingleDialer extends Emitter {
this.updateCallStatus = srf.locals.dbHelpers.updateCallStatus;
this.serviceUrl = srf.locals.serviceUrl;
this.ep = await ms.createEndpoint();
this._configMsEndpoint();
this.ep = await this._createMediaEndpoint();
this.logger.debug(`SingleDialer:exec - created endpoint ${this.ep.uuid}`);
/**
@@ -150,15 +149,21 @@ class SingleDialer extends Emitter {
return;
}
let lastSdp;
const connectStream = async(remoteSdp) => {
const connectStream = async(remoteSdp, isVideoCall) => {
if (remoteSdp === lastSdp) return;
if (process.env.JAMBONES_VIDEO_CALLS_ENABLED_IN_FS && !isVideoCall) {
remoteSdp = removeVideoSdp(remoteSdp);
}
lastSdp = remoteSdp;
return this.ep.modify(remoteSdp);
};
let localSdp = this.ep.local.sdp;
if (process.env.JAMBONES_VIDEO_CALLS_ENABLED_IN_FS && !opts.isVideoCall) {
localSdp = removeVideoSdp(localSdp);
}
Object.assign(opts, {
proxy: `sip:${this.sbcAddress}`,
localSdp: opts.opusFirst ? makeOpusFirst(this.ep.local.sdp) : this.ep.local.sdp
localSdp: opts.opusFirst ? makeOpusFirst(localSdp) : localSdp
});
if (this.target.auth) opts.auth = this.target.auth;
inviteSpan = this.startSpan('invite', {
@@ -213,18 +218,20 @@ class SingleDialer extends Emitter {
},
cbProvisional: (prov) => {
const status = {sipStatus: prov.status, sipReason: prov.reason};
// Update call-id for sbc outbound INVITE
this.callInfo.sbcCallid = prov.get('X-CID');
if ([180, 183].includes(prov.status) && prov.body) {
if (status.callStatus !== CallStatus.EarlyMedia) {
status.callStatus = CallStatus.EarlyMedia;
this.emit('earlyMedia');
}
connectStream(prov.body);
connectStream(prov.body, opts.isVideoCall);
}
else status.callStatus = CallStatus.Ringing;
this.emit('callStatusChange', status);
}
});
await connectStream(this.dlg.remote.sdp);
await connectStream(this.dlg.remote.sdp, opts.isVideoCall);
this.dlg.callSid = this.callSid;
this.inviteInProgress = null;
this.emit('callStatusChange', {
@@ -267,7 +274,12 @@ class SingleDialer extends Emitter {
this.logger.info('dial is onhold, emit event');
this.emit('reinvite', req, res);
} else {
const newSdp = await this.ep.modify(req.body);
let newSdp = await this.ep.modify(req.body);
// in case of reINVITE if video call is enabled in FS and the call is not a video call,
// remove video media from the SDP
if (process.env.JAMBONES_VIDEO_CALLS_ENABLED_IN_FS && !this.opts?.isVideoCall) {
newSdp = removeVideoSdp(newSdp);
}
res.send(200, {body: newSdp});
this.logger.info({offer: req.body, answer: newSdp}, 'SingleDialer:exec: handling reINVITE');
}
@@ -315,14 +327,25 @@ class SingleDialer extends Emitter {
/**
* kill the call in progress or the stable dialog, whichever we have
*/
async kill() {
async kill(Reason) {
this.killed = true;
if (this.inviteInProgress) await this.inviteInProgress.cancel();
if (this.inviteInProgress) {
try {
await this.inviteInProgress.cancel();
} catch (err) {
this.logger.error({err}, 'SingleDialer:kill error cancelling invite');
}
}
else if (this.dlg && this.dlg.connected) {
const duration = moment().diff(this.dlg.connectTime, 'seconds');
this.logger.debug('SingleDialer:kill hanging up called party');
this.emit('callStatusChange', {callStatus: CallStatus.Completed, duration});
this.dlg.destroy();
const headers = {
...(Reason && {'X-Reason': Reason})
};
this.dlg.destroy({
headers
});
}
if (this.ep) {
this.logger.debug(`SingleDialer:kill - deleting endpoint ${this.ep.uuid}`);
@@ -330,14 +353,19 @@ class SingleDialer extends Emitter {
}
}
_configMsEndpoint() {
const opts = {
...(this.onHoldMusic && {holdMusic: `shout://${this.onHoldMusic.replace(/^https?:\/\//, '')}`}),
...(JAMBONES_USE_FREESWITCH_TIMER_FD && {timer_name: 'timerfd'})
};
if (Object.keys(opts).length > 0) {
this.ep.set(opts);
}
async _handleMediaTimeout(evt, ep) {
this.logger.info({evt}, 'SingleDialer:_handleMediaTimeout - media timeout event received');
this.dialTask.kill(this.dialTask.cs, 'media-timeout');
}
async _createMediaEndpoint(drachtioFsmrfOptions = {}) {
return await createMediaEndpoint(this.srf, this.logger, {
activeMs: this.ms,
drachtioFsmrfOptions,
onHoldMusic: this.onHoldMusic,
inbandDtmfEnabled: this.dialTask?.inbandDtmfEnabled,
mediaTimeoutHandler: this._handleMediaTimeout.bind(this),
});
}
/**
@@ -379,7 +407,9 @@ class SingleDialer extends Emitter {
callInfo: this.callInfo,
accountInfo: this.accountInfo,
tasks,
rootSpan: this.rootSpan
rootSpan: this.rootSpan,
req: this.req,
tmpFiles: this.tmpFiles,
});
await cs.exec();
@@ -388,7 +418,10 @@ class SingleDialer extends Emitter {
} catch (err) {
this.logger.debug(err, 'SingleDialer:_executeApp: error');
this.emit('decline');
if (this.dlg.connected) this.dlg.destroy();
if (this.dlg.connected) {
this.dlg.destroy();
this.ep.destroy();
}
}
}
@@ -441,38 +474,49 @@ class SingleDialer extends Emitter {
});
app.requestor.request('session:adulting', '/adulting', {
...cs.callInfo.toJSON(),
parentCallInfo: this.parentCallInfo
parentCallInfo: this.parentCallInfo.toJSON()
}).catch((err) => {
newLogger.error({err}, 'doAdulting: error sending adulting request');
});
cs.req = this.req;
// fix: hanging up an adulting session previously did not send the Completed status callback
cs.wrapDialog(this.dlg);
cs.exec().catch((err) => newLogger.error({err}, 'doAdulting: error executing session'));
return cs;
}
async releaseMediaToSBC(remoteSdp, localSdp) {
async releaseMediaToSBC(remoteSdp, localSdp, releaseMediaEntirely) {
assert(this.dlg && this.dlg.connected && this.ep && typeof remoteSdp === 'string');
const sdp = stripCodecs(this.logger, remoteSdp, localSdp) || remoteSdp;
await this.dlg.modify(sdp, {
headers: {
'X-Reason': 'release-media'
'X-Reason': releaseMediaEntirely ? 'release-media-entirely' : 'release-media'
}
});
this.ep.destroy()
.then(() => this.ep = null)
.catch((err) => this.logger.error({err}, 'SingleDialer:releaseMediaToSBC: Error destroying endpoint'));
try {
await this.ep.destroy();
} catch (err) {
this.logger.error({err}, 'SingleDialer:releaseMediaToSBC: Error destroying endpoint');
}
this.ep = null;
}
async reAnchorMedia() {
async reAnchorMedia(currentMediaRoute = MediaPath.PartialMedia) {
assert(this.dlg && this.dlg.connected && !this.ep);
this.ep = await this.ms.createEndpoint({remoteSdp: this.dlg.remote.sdp});
this._configMsEndpoint();
this.logger.debug('SingleDialer:reAnchorMedia: re-anchoring media after partial media');
this.ep = await this._createMediaEndpoint({remoteSdp: this.dlg.remote.sdp});
await this.dlg.modify(this.ep.local.sdp, {
headers: {
'X-Reason': 'anchor-media'
}
});
if (currentMediaRoute === MediaPath.NoMedia) {
this.logger.debug('SingleDialer:reAnchorMedia: repoint endpoint after no media');
await this.ep.modify(this.dlg.remote.sdp);
}
}
_notifyCallStatusChange({callStatus, sipStatus, sipReason, duration}) {
@@ -499,11 +543,12 @@ class SingleDialer extends Emitter {
function placeOutdial({
logger, srf, ms, sbcAddress, target, opts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask,
onHoldMusic
onHoldMusic, tmpFiles
}) {
const myOpts = deepcopy(opts);
const sd = new SingleDialer({
logger, sbcAddress, target, myOpts, application, callInfo, accountInfo, rootSpan, startSpan, dialTask, onHoldMusic
logger, sbcAddress, target, opts: myOpts, application, callInfo,
accountInfo, rootSpan, startSpan, dialTask, onHoldMusic, tmpFiles
});
sd.exec(srf, ms, myOpts);
return sd;

View File

@@ -0,0 +1,91 @@
// lib/utils/process-monitor.js
const fs = require('fs');
const path = require('path');
class ProcessMonitor {
constructor(logger) {
this.logger = logger;
this.packageInfo = this.getPackageInfo();
this.processName = this.packageInfo.name || 'unknown-app';
}
getPackageInfo() {
try {
const packagePath = path.join(process.cwd(), 'package.json');
return JSON.parse(fs.readFileSync(packagePath, 'utf8'));
} catch (e) {
return { name: 'unknown', version: 'unknown' };
}
}
logStartup(additionalInfo = {}) {
const startupInfo = {
msg: `${this.processName} started`,
app_name: this.processName,
app_version: this.packageInfo.version,
pid: process.pid,
ppid: process.ppid,
pm2_instance_id: process.env.NODE_APP_INSTANCE || 'not_pm2',
pm2_id: process.env.pm_id,
is_pm2: !!process.env.PM2,
node_version: process.version,
uptime: process.uptime(),
timestamp: new Date().toISOString(),
...additionalInfo
};
this.logger.info(startupInfo);
return startupInfo;
}
setupSignalHandlers() {
// Log when we receive signals that would cause restart
process.on('SIGINT', () => {
this.logger.info({
msg: 'SIGINT received',
app_name: this.processName,
pid: process.pid,
ppid: process.ppid,
uptime: process.uptime(),
timestamp: new Date().toISOString()
});
process.exit(0);
});
process.on('SIGTERM', () => {
this.logger.info({
msg: 'SIGTERM received',
app_name: this.processName,
pid: process.pid,
ppid: process.ppid,
uptime: process.uptime(),
timestamp: new Date().toISOString()
});
process.exit(0);
});
process.on('uncaughtException', (error) => {
this.logger.error({
msg: 'Uncaught exception - process will restart',
app_name: this.processName,
error: error.message,
stack: error.stack,
pid: process.pid,
timestamp: new Date().toISOString()
});
process.exit(1);
});
process.on('unhandledRejection', (reason, promise) => {
this.logger.error({
msg: 'Unhandled rejection',
app_name: this.processName,
reason,
pid: process.pid,
timestamp: new Date().toISOString()
});
});
}
}
module.exports = ProcessMonitor;
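
A usage sketch for ProcessMonitor (illustrative; any pino-style logger exposing info() and error() will do, and the extra startup field is an arbitrary example):

const ProcessMonitor = require('./lib/utils/process-monitor');

const monitor = new ProcessMonitor(logger);       // logger must provide info() and error()
monitor.setupSignalHandlers();                    // logs SIGINT/SIGTERM/uncaughtException before exiting
monitor.logStartup({ role: 'feature-server' });   // additionalInfo fields are merged into the startup log entry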

View File

@@ -1,5 +1,5 @@
const assert = require('assert');
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const {LifeCycleEvents, FS_UUID_SET_NAME} = require('./constants');
const Emitter = require('events');
const debug = require('debug')('jambonz:feature-server');
@@ -8,7 +8,7 @@ const {
JAMBONES_SBCS,
K8S,
K8S_SBC_SIP_SERVICE_NAME,
AWS_SNS_TOPIC_ARM,
AWS_SNS_TOPIC_ARN,
OPTIONS_PING_INTERVAL,
AWS_REGION,
NODE_ENV,
@@ -35,7 +35,7 @@ module.exports = (logger) => {
// listen for SNS lifecycle changes
let lifecycleEmitter = new Emitter();
let dryUpCalls = false;
if (AWS_SNS_TOPIC_ARM && AWS_REGION) {
if (AWS_SNS_TOPIC_ARN && AWS_REGION) {
(async function() {
try {
@@ -46,12 +46,24 @@ module.exports = (logger) => {
const {srf} = require('../..');
srf.locals.publicIp = publicIp;
})
.on(LifeCycleEvents.ScaleIn, () => {
.on(LifeCycleEvents.ScaleIn, async() => {
logger.info('AWS scale-in notification: begin drying up calls');
dryUpCalls = true;
lifecycleEmitter.operationalState = LifeCycleEvents.ScaleIn;
const {srf} = require('../..');
const {writeSystemAlerts} = srf.locals;
if (writeSystemAlerts) {
const {SystemState, FEATURE_SERVER} = require('./constants');
await writeSystemAlerts({
system_component: FEATURE_SERVER,
state : SystemState.GracefulShutdownInProgress,
fields : {
detail: `feature-server with process_id ${process.pid} shutdown in progress`,
host: srf.locals?.ipv4
}
});
}
pingProxies(srf);
// if we have zero calls, we can complete the scale-in right
@@ -118,7 +130,7 @@ module.exports = (logger) => {
logger.info('disabling OPTIONS pings since we are running as a kubernetes service');
const {srf} = require('../..');
const {addToSet} = srf.locals.dbHelpers;
const uuid = srf.locals.fsUUID = uuidv4();
const uuid = srf.locals.fsUUID = crypto.randomUUID();
/* in case redis is restarted, re-insert our key every so often */
setInterval(() => {

View File

@@ -35,6 +35,12 @@ const makeOpusFirst = (sdp) => {
}
return sdpTransform.write(parsedSdp);
};
const removeVideoSdp = (sdp) => {
const parsedSdp = sdpTransform.parse(sdp);
// Filter out video media sections, keeping only non-video media
parsedSdp.media = parsedSdp.media.filter((media) => media.type !== 'video');
return sdpTransform.write(parsedSdp);
};
const extractSdpMedia = (sdp) => {
const parsedSdp1 = sdpTransform.parse(sdp);
@@ -49,10 +55,28 @@ const extractSdpMedia = (sdp) => {
}
};
const getLeadingCodec = (sdp) => {
if (!sdp) {
return null;
}
const parsed = sdpTransform.parse(sdp);
const audio = parsed.media?.find((m) => m.type === 'audio');
if (!audio) {
return null;
}
return audio.rtp?.[0]?.codec || null;
};
module.exports = {
isOnhold,
mergeSdpMedia,
extractSdpMedia,
isOpusFirst,
makeOpusFirst
makeOpusFirst,
removeVideoSdp,
getLeadingCodec
};
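
A short sketch of the two helpers added above (illustrative; sdp is any session description string, and the require path assumes the ./sdp-utils module referenced earlier in this diff):

const { removeVideoSdp, getLeadingCodec } = require('./sdp-utils');

const audioOnlySdp = removeVideoSdp(sdp);   // strips every m=video section, keeps the remaining media
const codec = getLeadingCodec(sdp);         // e.g. 'PCMU' or 'opus'; null if there is no audio m-line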

View File

@@ -1,5 +1,5 @@
const xmlParser = require('xml2js').parseString;
const uuidv4 = require('uuid-random');
const crypto = require('crypto');
const parseUri = require('drachtio-srf').parseUri;
const transform = require('sdp-transform');
const debug = require('debug')('jambonz:feature-server');
@@ -52,7 +52,7 @@ const parseSiprecPayload = (req, logger) => {
const arr = /^([^]+)(m=[^]+?)(m=[^]+?)$/.exec(sdp);
opts.sdp1 = `${arr[1]}${arr[2]}`;
opts.sdp2 = `${arr[1]}${arr[3]}\r\n`;
opts.sessionId = uuidv4();
opts.sessionId = crypto.randomUUID();
logger.info({ payload: req.payload }, 'SIPREC payload with no metadata (e.g. Cisco NBR)');
resolve(opts);
} else if (!sdp || !meta) {
@@ -64,7 +64,7 @@ const parseSiprecPayload = (req, logger) => {
if (err) { throw err; }
opts.recordingData = result ;
opts.sessionId = uuidv4() ;
opts.sessionId = crypto.randomUUID();
const arr = /^([^]+)(m=[^]+?)(m=[^]+?)$/.exec(sdp) ;
opts.sdp1 = `${arr[1]}${arr[2]}` ;

View File

@@ -0,0 +1,74 @@
const EventEmitter = require('events');
/**
* A specialized EventEmitter that caches the most recent event emissions.
* When new listeners are added, they immediately receive the most recent
* event if it was previously emitted. This is useful for handling state
* changes where late subscribers need to know the current state.
*
* Features:
* - Caches the most recent emission for each event type
* - New listeners immediately receive the cached event if available
* - Supports both regular (on) and one-time (once) listeners
* - Maintains compatibility with Node's EventEmitter interface
*/
class StickyEventEmitter extends EventEmitter {
constructor() {
super();
this._eventCache = new Map();
this._onceListeners = new Map(); // For storing once listeners if needed
}
destroy() {
this._eventCache.clear();
this._onceListeners.clear();
this.removeAllListeners();
}
emit(event, ...args) {
// Store the event and its args
this._eventCache.set(event, args);
// If there are any 'once' listeners waiting, call them
if (this._onceListeners.has(event)) {
const listeners = this._onceListeners.get(event);
for (const listener of listeners) {
listener(...args);
}
if (this.onSuccess) {
this.onSuccess();
}
this._onceListeners.delete(event);
// return from here as the event listener is already called
// this is to avoid calling the native emit method which
// will call the event listener again
return true;
}
return super.emit(event, ...args);
}
on(event, listener) {
if (this._eventCache.has(event)) {
listener(...this._eventCache.get(event));
}
return super.on(event, listener);
}
once(event, listener) {
if (this._eventCache.has(event)) {
listener(...this._eventCache.get(event));
if (this.onSuccess) {
this.onSuccess();
}
} else {
// Store listener in case emit comes before
if (!this._onceListeners.has(event)) {
this._onceListeners.set(event, []);
}
this._onceListeners.get(event).push(listener);
super.once(event, listener); // Also attach to native once
}
return this;
}
}
module.exports = StickyEventEmitter;
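
A minimal sketch of the "sticky" behavior described in the class comment (illustrative; the require path is assumed, since the file name is not shown in this view):

const StickyEventEmitter = require('./lib/utils/sticky-event-emitter');  // path assumed

const emitter = new StickyEventEmitter();
emitter.emit('state', 'connected');            // emission is cached
emitter.on('state', (s) => console.log(s));    // late subscriber fires immediately with 'connected'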

View File

@@ -0,0 +1,197 @@
const { assert } = require('console');
const Emitter = require('events');
const {
VadDetection,
SileroVadDetection
} = require('../utils/constants.json');
class SttLatencyCalculator extends Emitter {
constructor({ logger, cs}) {
super();
this.logger = logger;
this.cs = cs;
this.isRunning = false;
this.isInTalkSpurt = false;
this.start_talking_time = 0;
this.talkspurts = [];
this.vendor = this.cs.vad?.vendor || 'silero';
this.stt_start_time = 0;
this.stt_stop_time = 0;
this.stt_on_transcription_time = 0;
}
set sttStartTime(time) {
this.stt_start_time = time;
}
get sttStartTime() {
return this.stt_start_time || 0;
}
set sttStopTime(time) {
this.stt_stop_time = time;
}
get sttStopTime() {
return this.stt_stop_time || 0;
}
set sttOnTranscriptionTime(time) {
this.stt_on_transcription_time = time;
}
get sttOnTranscriptionTime() {
return this.stt_on_transcription_time || 0;
}
_onVadDetected(_ep, _evt, fsEvent) {
if (fsEvent.getHeader('detected-event') === 'stop_talking') {
if (this.isInTalkSpurt) {
this.talkspurts.push({
start: this.start_talking_time,
stop: Date.now()
});
}
this.start_talking_time = 0;
this.isInTalkSpurt = false;
} else if (fsEvent.getHeader('detected-event') === 'start_talking') {
this.start_talking_time = Date.now();
this.isInTalkSpurt = true;
}
}
_startVad() {
assert(!this.isRunning, 'Latency calculator is already running');
assert(this.cs.ep, 'Callsession has no endpoint to start the latency calculator');
const ep = this.cs.ep;
if (!ep.sttLatencyVadHandler) {
ep.sttLatencyVadHandler = this._onVadDetected.bind(this, ep);
if (this.vendor === 'silero') {
ep.addCustomEventListener(SileroVadDetection.Detection, ep.sttLatencyVadHandler);
} else {
ep.addCustomEventListener(VadDetection.Detection, ep.sttLatencyVadHandler);
}
}
this.stop_talking_time = 0;
this.start_talking_time = 0;
this.vad = {
...(this.cs.vad || {}),
strategy: 'continuous',
bugname: 'stt-latency-calculator-vad',
vendor: this.vendor
};
ep.startVadDetection(this.vad);
this.isRunning = true;
}
_stopVad() {
if (this.isRunning) {
this.logger.warn('Latency calculator is still running, stopping VAD detection');
const ep = this.cs.ep;
ep.stopVadDetection(this.vad);
if (ep.sttLatencyVadHandler) {
if (this.vendor === 'silero') {
ep.removeCustomEventListener(SileroVadDetection.Detection, ep.sttLatencyVadHandler);
} else {
ep.removeCustomEventListener(VadDetection.Detection, ep.sttLatencyVadHandler);
}
ep.sttLatencyVadHandler = null;
}
this.isRunning = false;
this.logger.info('STT Latency Calculator stopped');
}
}
start() {
if (this.isRunning) {
this.logger.warn('Latency calculator is already running');
return;
}
if (!this.cs.ep) {
this.logger.error('Callsession has no endpoint to start the latency calculator');
return;
}
this._startVad();
this.logger.debug('STT Latency Calculator started');
}
stop() {
this._stopVad();
}
toUnixTimestamp(date) {
return Math.floor(date / 1000);
}
calculateLatency() {
if (!this.isRunning) {
this.logger.debug('Latency calculator is not running, cannot calculate latency, returning default values');
return null;
}
const stt_stop_time = this.stt_stop_time || Date.now();
if (this.isInTalkSpurt) {
this.talkspurts.push({
start: this.start_talking_time,
stop: stt_stop_time
});
this.isInTalkSpurt = false;
this.start_talking_time = 0;
}
const stt_on_transcription_time = this.stt_on_transcription_time || stt_stop_time;
const start_talking_time = this.talkspurts[0]?.start;
let lastIdx = this.talkspurts.length - 1;
lastIdx = lastIdx < 0 ? 0 : lastIdx;
const stop_talking_time = this.talkspurts[lastIdx]?.stop || stt_stop_time;
return {
stt_start_time: this.toUnixTimestamp(this.stt_start_time),
stt_stop_time: this.toUnixTimestamp(stt_stop_time),
start_talking_time: this.toUnixTimestamp(start_talking_time),
stop_talking_time: this.toUnixTimestamp(stop_talking_time),
stt_latency: parseFloat((Math.abs(stt_on_transcription_time - stop_talking_time)) / 1000).toFixed(2),
stt_latency_ms: Math.abs(stt_on_transcription_time - stop_talking_time),
stt_usage: parseFloat((stt_stop_time - this.stt_start_time) / 1000).toFixed(2),
talkspurts: this.talkspurts.map((ts) =>
([this.toUnixTimestamp(ts.start || 0), this.toUnixTimestamp(ts.stop || 0)]))
};
}
resetTime() {
if (!this.isRunning) {
return;
}
this.stt_start_time = Date.now();
this.stt_stop_time = 0;
this.stt_on_transcription_time = 0;
this.clearTalkspurts();
this.logger.info('STT Latency Calculator reset');
}
onTranscriptionReceived() {
if (!this.isRunning) {
return;
}
this.stt_on_transcription_time = Date.now();
this.logger.debug(`CallSession:on-transcription set to ${this.stt_on_transcription_time}`);
}
onTranscribeStop() {
if (!this.isRunning) {
return;
}
this.stt_stop_time = Date.now();
this.logger.debug(`CallSession:transcribe-stop set to ${this.stt_stop_time}`);
}
clearTalkspurts() {
this.talkspurts = [];
if (!this.isInTalkSpurt) {
this.start_talking_time = 0;
}
}
}
module.exports = SttLatencyCalculator;
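
A sketch of how the calculator is driven (illustrative; cs is a CallSession with an active endpoint, and the method and field names come from the class above):

const calc = new SttLatencyCalculator({ logger, cs });
calc.start();                      // begins VAD-based talkspurt tracking on cs.ep
calc.sttStartTime = Date.now();
// ...when the recognizer returns a transcript and transcription stops:
calc.onTranscriptionReceived();
calc.onTranscribeStop();
const metrics = calc.calculateLatency();
// => { stt_latency_ms, stt_latency, stt_usage, talkspurts: [[start, stop], ...], ... }
calc.stop();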

View File

@@ -1,7 +1,4 @@
const {
TaskName,
} = require('./constants.json');
const {TaskName} = require('./constants.json');
const stickyVars = {
google: [
'GOOGLE_SPEECH_HINTS',
@@ -33,6 +30,7 @@ const stickyVars = {
'DEEPGRAM_SPEECH_TIER',
'DEEPGRAM_SPEECH_MODEL',
'DEEPGRAM_SPEECH_ENABLE_SMART_FORMAT',
'DEEPGRAM_SPEECH_ENABLE_NO_DELAY',
'DEEPGRAM_SPEECH_ENABLE_AUTOMATIC_PUNCTUATION',
'DEEPGRAM_SPEECH_PROFANITY_FILTER',
'DEEPGRAM_SPEECH_REDACT',
@@ -45,12 +43,21 @@ const stickyVars = {
'DEEPGRAM_SPEECH_ENDPOINTING',
'DEEPGRAM_SPEECH_UTTERANCE_END_MS',
'DEEPGRAM_SPEECH_VAD_TURNOFF',
'DEEPGRAM_SPEECH_TAG'
'DEEPGRAM_SPEECH_TAG',
'DEEPGRAM_SPEECH_MODEL_VERSION',
'DEEPGRAM_SPEECH_FILLER_WORDS',
'DEEPGRAM_SPEECH_KEYTERMS',
],
aws: [
'AWS_VOCABULARY_NAME',
'AWS_VOCABULARY_FILTER_METHOD',
'AWS_VOCABULARY_FILTER_NAME'
'AWS_VOCABULARY_FILTER_NAME',
'AWS_LANGUAGE_MODEL_NAME',
'AWS_ACCESS_KEY_ID',
'AWS_SECRET_ACCESS_KEY',
'AWS_REGION',
'AWS_SECURITY_TOKEN',
'AWS_PII_ENTITY_TYPES',
],
nuance: [
'NUANCE_ACCESS_TOKEN',
@@ -99,7 +106,68 @@ const stickyVars = {
assemblyai: [
'ASSEMBLYAI_API_KEY',
'ASSEMBLYAI_WORD_BOOST'
]
],
voxist: [
'VOXIST_API_KEY',
],
cartesia: [
'CARTESIA_API_KEY',
'CARTESIA_MODEL_ID'
],
speechmatics: [
'SPEECHMATICS_API_KEY',
'SPEECHMATICS_HOST',
'SPEECHMATICS_PATH',
'SPEECHMATICS_SPEECH_HINTS',
'SPEECHMATICS_TRANSLATION_LANGUAGES',
'SPEECHMATICS_TRANSLATION_PARTIALS'
],
openai: [
'OPENAI_API_KEY',
'OPENAI_MODEL',
'OPENAI_INPUT_AUDIO_NOISE_REDUCTION',
'OPENAI_TURN_DETECTION_TYPE',
'OPENAI_TURN_DETECTION_THRESHOLD',
'OPENAI_TURN_DETECTION_PREFIX_PADDING_MS',
'OPENAI_TURN_DETECTION_SILENCE_DURATION_MS',
],
houndify: [
'HOUNDIFY_CLIENT_ID',
'HOUNDIFY_CLIENT_KEY',
'HOUNDIFY_USER_ID',
'HOUNDIFY_MAX_SILENCE_SECONDS',
'HOUNDIFY_MAX_SILENCE_AFTER_FULL_QUERY_SECONDS',
'HOUNDIFY_MAX_SILENCE_AFTER_PARTIAL_QUERY_SECONDS',
'HOUNDIFY_VAD_SENSITIVITY',
'HOUNDIFY_VAD_TIMEOUT',
'HOUNDIFY_VAD_MODE',
'HOUNDIFY_VAD_VOICE_MS',
'HOUNDIFY_VAD_SILENCE_MS',
'HOUNDIFY_VAD_DEBUG',
'HOUNDIFY_AUDIO_FORMAT',
'HOUNDIFY_ENABLE_NOISE_REDUCTION',
'HOUNDIFY_AUDIO_ENDPOINT',
'HOUNDIFY_ENABLE_PROFANITY_FILTER',
'HOUNDIFY_ENABLE_PUNCTUATION',
'HOUNDIFY_ENABLE_CAPITALIZATION',
'HOUNDIFY_CONFIDENCE_THRESHOLD',
'HOUNDIFY_ENABLE_DISFLUENCY_FILTER',
'HOUNDIFY_MAX_RESULTS',
'HOUNDIFY_ENABLE_WORD_TIMESTAMPS',
'HOUNDIFY_MAX_ALTERNATIVES',
'HOUNDIFY_PARTIAL_TRANSCRIPT_INTERVAL',
'HOUNDIFY_SESSION_TIMEOUT',
'HOUNDIFY_CONNECTION_TIMEOUT',
'HOUNDIFY_LATITUDE',
'HOUNDIFY_LONGITUDE',
'HOUNDIFY_CITY',
'HOUNDIFY_STATE',
'HOUNDIFY_COUNTRY',
'HOUNDIFY_TIMEZONE',
'HOUNDIFY_DOMAIN',
'HOUNDIFY_CUSTOM_VOCABULARY',
'HOUNDIFY_LANGUAGE_MODEL'
],
};
/**
@@ -141,7 +209,6 @@ const optimalDeepramModels = {
tr: ['nova-2', 'nova-2'],
uk: ['nova-2', 'nova-2']
};
const selectDefaultDeepgramModel = (task, language) => {
if (language in optimalDeepramModels) {
const [gather, transcribe] = optimalDeepramModels[language];
@@ -150,8 +217,34 @@ const selectDefaultDeepgramModel = (task, language) => {
return 'base';
};
const optimalGoogleModels = {
'v1' : {
'en-IN':['telephony', 'telephony'],
'es-DO':['default', 'default'],
'es-MX':['default', 'default'],
'en-AU':['telephony', 'telephony'],
'en-GB':['telephony', 'telephony'],
'en-NZ':['telephony', 'telephony']
},
'v2' : {
'en-IN':['telephony', 'long']
}
};
const selectDefaultGoogleModel = (task, language, version) => {
const useV2 = version === 'v2';
if (language in optimalGoogleModels[version]) {
const [gather, transcribe] = optimalGoogleModels[version][language];
return task.name === TaskName.Gather ? gather : transcribe;
}
return task.name === TaskName.Gather ?
(useV2 ? 'telephony_short' : 'command_and_search') :
(useV2 ? 'long' : 'latest_long');
};
const consolidateTranscripts = (bufferedTranscripts, channel, language, vendor) => {
if (bufferedTranscripts.length === 1) return bufferedTranscripts[0];
if (bufferedTranscripts.length === 1) {
bufferedTranscripts[0].is_final = true;
return bufferedTranscripts[0];
}
let totalConfidence = 0;
const finalTranscript = bufferedTranscripts.reduce((acc, evt) => {
totalConfidence += evt.alternatives[0].confidence;
@@ -171,7 +264,7 @@ const consolidateTranscripts = (bufferedTranscripts, channel, language, vendor)
const lastChar = acc.alternatives[0].transcript.slice(-1);
const firstChar = newTranscript.charAt(0);
if (lastChar.match(/\d/) && firstChar.match(/\d/)) {
if (vendor === 'speechmatics' || (lastChar.match(/\d/) && firstChar.match(/\d/))) {
acc.alternatives[0].transcript += newTranscript;
} else {
acc.alternatives[0].transcript += ` ${newTranscript}`;
@@ -261,13 +354,18 @@ const normalizeDeepgram = (evt, channel, language, shortUtterance) => {
confidence: alt.confidence,
transcript: alt.transcript,
}));
/**
* Some models (nova-2-general) return the detected language in the
* alternatives.languages array when the language is set to multi.
* If the language is detected, we use it as the language_code.
*/
const detectedLanguage = evt.channel?.alternatives?.[0]?.languages?.[0];
/**
* note difference between is_final and speech_final in Deepgram:
* https://developers.deepgram.com/docs/understand-endpointing-interim-results
*/
return {
language_code: language,
language_code: detectedLanguage || language,
channel_tag: channel,
is_final: shortUtterance ? evt.is_final : evt.speech_final,
alternatives: alternatives.length ? [alternatives[0]] : [],
@@ -278,6 +376,61 @@ const normalizeDeepgram = (evt, channel, language, shortUtterance) => {
};
};
const normalizeGladia = (evt, channel, language, shortUtterance) => {
const copy = JSON.parse(JSON.stringify(evt));
// Handle Gladia transcript format
if (evt.type === 'transcript' && evt.data && evt.data.utterance) {
const utterance = evt.data.utterance;
const alternatives = [{
confidence: utterance.confidence || 0,
transcript: utterance.text || '',
}];
return {
language_code: utterance.language || language,
channel_tag: channel,
is_final: evt.data.is_final || false,
alternatives,
vendor: {
name: 'gladia',
evt: copy
}
};
}
};
const normalizeDeepgramFlux = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
let turnTakingEvent;
if (['StartOfTurn', 'EagerEndOfTurn', 'TurnResumed', 'EndOfTurn'].includes(evt.event)) {
turnTakingEvent = evt.event;
}
/* calculate total confidence based on word-level confidence */
const realWords = (evt.words || [])
.filter((w) => !',.!?;'.includes(w.word));
const confidence = realWords.length > 0 ? realWords.reduce((acc, w) => acc + w.confidence, 0) / realWords.length : 0;
return {
language_code: language,
channel_tag: channel,
is_final: evt.event === 'EndOfTurn',
alternatives: [
{
confidence,
end_of_turn_confidence: evt.end_of_turn_confidence,
transcript: evt.transcript,
...(turnTakingEvent && {turn_taking_event: turnTakingEvent})
}
],
vendor: {
name: 'deepgramflux',
evt: copy
}
};
};
const normalizeNvidia = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const alternatives = (evt.alternatives || [])
@@ -316,8 +469,10 @@ const normalizeIbm = (evt, channel, language) => {
const normalizeGoogle = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const language_code = evt.language_code || language;
return {
language_code: language,
language_code: language_code,
channel_tag: channel,
is_final: evt.is_final,
alternatives: [evt.alternatives[0]],
@@ -421,37 +576,189 @@ const normalizeMicrosoft = (evt, channel, language, punctuation = true) => {
const normalizeAws = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const isGrpcPayload = Array.isArray(evt);
if (isGrpcPayload) {
/* legacy grpc api */
return {
language_code: language,
channel_tag: channel,
is_final: evt[0].is_final,
alternatives: evt[0].alternatives,
vendor: {
name: 'aws',
evt: copy
}
};
}
else {
/* websocket api */
const alternatives = evt.Transcript?.Results[0]?.Alternatives.map((alt) => {
const items = alt.Items.filter((item) => item.Type === 'pronunciation' && 'Confidence' in item);
const confidence = items.reduce((acc, item) => acc + item.Confidence, 0) / items.length;
return {
transcript: alt.Transcript,
confidence
};
});
return {
language_code: language,
channel_tag: channel,
is_final: evt.Transcript?.Results[0].IsPartial === false,
alternatives,
vendor: {
name: 'aws',
evt: copy
}
};
}
};
const normalizeAssemblyAi = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const alternatives = [];
let is_final = false;
if (evt.type && evt.type === 'Turn') {
// v3 is here
alternatives.push({
confidence: evt.end_of_turn_confidence,
transcript: evt.transcript,
});
is_final = evt.end_of_turn;
} else {
alternatives.push({
confidence: evt.confidence,
transcript: evt.text,
});
is_final = evt.message_type === 'FinalTranscript';
}
return {
language_code: language,
channel_tag: channel,
is_final: evt[0].is_final,
alternatives: evt[0].alternatives,
is_final,
alternatives,
vendor: {
name: 'aws',
name: 'assemblyai',
evt: copy
}
};
};
const normalizeAssemblyAi = (evt, channel, language) => {
const normalizeHoundify = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const alternatives = [];
const is_final = evt.ResultsAreFinal && evt.ResultsAreFinal[0] === true;
if (evt.Disambiguation && evt.Disambiguation.ChoiceData && evt.Disambiguation.ChoiceData.length > 0) {
// Handle Houndify Voice Search Result format
const choiceData = evt.Disambiguation.ChoiceData[0];
alternatives.push({
confidence: choiceData.ConfidenceScore || choiceData.ASRConfidence || 0.0,
transcript: choiceData.FormattedTranscription || choiceData.Transcription || '',
});
}
return {
language_code: language,
channel_tag: channel,
is_final,
alternatives,
vendor: {
name: 'houndify',
evt: copy
}
};
};
const normalizeVoxist = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
return {
language_code: language,
channel_tag: channel,
is_final: evt.message_type === 'FinalTranscript',
is_final: evt.type === 'final',
alternatives: [
{
confidence: evt.confidence,
confidence: 1.00,
transcript: evt.text,
}
],
vendor: {
name: 'ASSEMBLYAI',
name: 'voxist',
evt: copy
}
};
};
const normalizeCartesia = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
return {
language_code: language,
channel_tag: channel,
is_final: evt.is_final,
alternatives: [
{
confidence: 1.00,
transcript: evt.text,
}
],
vendor: {
name: 'cartesia',
evt: copy
}
};
};
const normalizeSpeechmatics = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const is_final = evt.message === 'AddTranscript';
const words = evt.results?.filter((r) => r.type === 'word') || [];
const confidence = words.length > 0 ?
words.reduce((acc, word) => acc + word.alternatives[0].confidence, 0) / words.length :
0;
const alternative = {
confidence,
transcript: evt.metadata?.transcript
};
const obj = {
language_code: language,
channel_tag: channel,
is_final,
alternatives: [alternative],
vendor: {
name: 'speechmatics',
evt: copy
}
};
return obj;
};
const calculateConfidence = (logprobsArray) => {
// Sum the individual log probabilities
const totalLogProb = logprobsArray.reduce((sum, tokenInfo) => sum + tokenInfo.logprob, 0);
// Convert the total log probability back to a regular probability
const confidence = Math.exp(totalLogProb);
return confidence;
};
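// worked example (illustrative): token logprobs of -0.11 and -0.36 sum to -0.47,
// and Math.exp(-0.47) ≈ 0.63, i.e. roughly 63% confidence for the whole transcript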
const normalizeOpenAI = (evt, channel, language) => {
const copy = JSON.parse(JSON.stringify(evt));
const obj = {
language_code: language,
channel_tag: channel,
is_final: true,
alternatives: [
{
transcript: evt.transcript,
confidence: evt.logprobs ? calculateConfidence(evt.logprobs) : 1.0,
}
],
vendor: {
name: 'openai',
evt: copy
}
};
return obj;
};
module.exports = (logger) => {
const normalizeTranscription = (evt, vendor, channel, language, shortUtterance, punctuation) => {
@@ -459,6 +766,10 @@ module.exports = (logger) => {
switch (vendor) {
case 'deepgram':
return normalizeDeepgram(evt, channel, language, shortUtterance);
case 'gladia':
return normalizeGladia(evt, channel, language, shortUtterance);
case 'deepgramflux':
return normalizeDeepgramFlux(evt, channel, language, shortUtterance);
case 'microsoft':
return normalizeMicrosoft(evt, channel, language, punctuation);
case 'google':
@@ -477,8 +788,18 @@ module.exports = (logger) => {
return normalizeCobalt(evt, channel, language);
case 'assemblyai':
return normalizeAssemblyAi(evt, channel, language, shortUtterance);
case 'houndify':
return normalizeHoundify(evt, channel, language, shortUtterance);
case 'voxist':
return normalizeVoxist(evt, channel, language);
case 'cartesia':
return normalizeCartesia(evt, channel, language);
case 'verbio':
return normalizeVerbio(evt, channel, language);
case 'speechmatics':
return normalizeSpeechmatics(evt, channel, language);
case 'openai':
return normalizeOpenAI(evt, channel, language);
default:
if (vendor.startsWith('custom:')) {
return normalizeCustom(evt, channel, language, vendor);
@@ -494,9 +815,9 @@ module.exports = (logger) => {
if ('google' === vendor) {
const useV2 = rOpts.googleOptions?.serviceVersion === 'v2';
const model = task.name === TaskName.Gather ?
(useV2 ? 'telephony_short' : 'command_and_search') :
(useV2 ? 'long' : 'latest_long');
const version = useV2 ? 'v2' : 'v1';
let {model} = rOpts;
model = model || selectDefaultGoogleModel(task, language, version);
opts = {
...opts,
...(sttCredentials && {GOOGLE_APPLICATION_CREDENTIALS: JSON.stringify(sttCredentials.credentials)}),
@@ -552,17 +873,32 @@ module.exports = (logger) => {
};
}
else if (['aws', 'polly'].includes(vendor)) {
const {awsOptions = {}} = rOpts;
const vocabularyName = awsOptions.vocabularyName || rOpts.vocabularyName;
const vocabularyFilterName = awsOptions.vocabularyFilterName || rOpts.vocabularyFilterName;
const filterMethod = awsOptions.vocabularyFilterMethod || rOpts.filterMethod;
opts = {
...opts,
...(rOpts.vocabularyName && {AWS_VOCABULARY_NAME: rOpts.vocabularyName}),
...(rOpts.vocabularyFilterName && {AWS_VOCABULARY_FILTER_NAME: rOpts.vocabularyFilterName}),
...(rOpts.filterMethod && {AWS_VOCABULARY_FILTER_METHOD: rOpts.filterMethod}),
...(vocabularyName && {AWS_VOCABULARY_NAME: vocabularyName}),
...(vocabularyFilterName && {AWS_VOCABULARY_FILTER_NAME: vocabularyFilterName}),
...(filterMethod && {AWS_VOCABULARY_FILTER_METHOD: filterMethod}),
...(sttCredentials && {
...(sttCredentials.accessKeyId && {AWS_ACCESS_KEY_ID: sttCredentials.accessKeyId}),
...(sttCredentials.secretAccessKey && {AWS_SECRET_ACCESS_KEY: sttCredentials.secretAccessKey}),
AWS_ACCESS_KEY_ID: sttCredentials.accessKeyId,
AWS_SECRET_ACCESS_KEY: sttCredentials.secretAccessKey,
AWS_REGION: sttCredentials.region,
...(sttCredentials.sessionToken && {AWS_SESSION_TOKEN: sttCredentials.sessionToken}),
AWS_SECURITY_TOKEN: sttCredentials.securityToken,
AWS_SESSION_TOKEN: sttCredentials.sessionToken ? sttCredentials.sessionToken : sttCredentials.securityToken
}),
...(awsOptions.accessKey && {AWS_ACCESS_KEY_ID: awsOptions.accessKey}),
...(awsOptions.secretKey && {AWS_SECRET_ACCESS_KEY: awsOptions.secretKey}),
...(awsOptions.region && {AWS_REGION: awsOptions.region}),
...(awsOptions.securityToken && {AWS_SECURITY_TOKEN: awsOptions.securityToken}),
...(awsOptions.sessionToken && {AWS_SESSION_TOKEN: awsOptions.sessionToken ?
awsOptions.sessionToken : awsOptions.securityToken}),
...(awsOptions.languageModelName && {AWS_LANGUAGE_MODEL_NAME: awsOptions.languageModelName}),
...(awsOptions.piiEntityTypes?.length && {AWS_PII_ENTITY_TYPES: awsOptions.piiEntityTypes.join(',')}),
...(awsOptions.piiIdentifyEntities && {AWS_PII_IDENTIFY_ENTITIES: true}),
...(awsOptions.languageModelName && {AWS_LANGUAGE_MODEL_NAME: awsOptions.languageModelName}),
};
}
else if ('microsoft' === vendor) {
@@ -601,6 +937,8 @@ module.exports = (logger) => {
//azureSttEndpointId overrides sttCredentials.custom_stt_endpoint
...(rOpts.azureSttEndpointId &&
{AZURE_SERVICE_ENDPOINT_ID: rOpts.azureSttEndpointId}),
...(azureOptions.speechRecognitionMode &&
{AZURE_RECOGNITION_MODE: azureOptions.speechRecognitionMode}),
};
}
else if ('nuance' === vendor) {
@@ -652,11 +990,19 @@ module.exports = (logger) => {
};
}
else if ('deepgram' === vendor) {
let {model} = rOpts;
let model = rOpts.deepgramOptions?.model || rOpts.model || sttCredentials.model_id;
const {deepgramOptions = {}} = rOpts;
const deepgramUri = deepgramOptions.deepgramSttUri || sttCredentials.deepgram_stt_uri;
const useTls = deepgramOptions.deepgramSttUseTls || sttCredentials.deepgram_stt_use_tls;
// DH (2025-08-11) entity_prompt is currently limited to 100 words
const entityPrompt = deepgramOptions.entityPrompt ?
deepgramOptions.entityPrompt
.split(/\s+/)
.slice(0, 100)
.join(' ')
: undefined;
/* default to a sensible model if not supplied */
if (!model) {
model = selectDefaultDeepgramModel(task, language);
@@ -674,6 +1020,8 @@ module.exports = (logger) => {
{DEEPGRAM_SPEECH_ENABLE_AUTOMATIC_PUNCTUATION: 1},
...(deepgramOptions.smartFormatting) &&
{DEEPGRAM_SPEECH_ENABLE_SMART_FORMAT: 1},
...(deepgramOptions.noDelay) &&
{DEEPGRAM_SPEECH_ENABLE_NO_DELAY: 1},
...(deepgramOptions.profanityFilter) &&
{DEEPGRAM_SPEECH_PROFANITY_FILTER: 1},
...(deepgramOptions.redact) &&
@@ -707,7 +1055,41 @@ module.exports = (logger) => {
...(deepgramOptions.vadTurnoff) &&
{DEEPGRAM_SPEECH_VAD_TURNOFF: deepgramOptions.vadTurnoff},
...(deepgramOptions.tag) &&
{DEEPGRAM_SPEECH_TAG: deepgramOptions.tag}
{DEEPGRAM_SPEECH_TAG: deepgramOptions.tag},
...(deepgramOptions.version) &&
{DEEPGRAM_SPEECH_MODEL_VERSION: deepgramOptions.version},
...(deepgramOptions.fillerWords) &&
{DEEPGRAM_SPEECH_FILLER_WORDS: deepgramOptions.fillerWords},
...((Array.isArray(deepgramOptions.keyterms) && deepgramOptions.keyterms.length > 0) &&
{DEEPGRAM_SPEECH_KEYTERMS: deepgramOptions.keyterms.join(',')}),
...(deepgramOptions.mipOptOut && {DEEPGRAM_SPEECH_MIP_OPT_OUT: deepgramOptions.mipOptOut}),
...(entityPrompt && {DEEPGRAM_SPEECH_ENTITY_PROMPT: entityPrompt}),
};
}
else if ('deepgramflux' === vendor) {
const {
eotThreshold,
eotTimeoutMs,
mipOptOut,
model,
eagerEotThreshold,
keyterms
} = rOpts.deepgramOptions || {};
opts = {
DEEPGRAMFLUX_API_KEY: sttCredentials.api_key,
DEEPGRAMFLUX_SPEECH_MODEL: model || 'flux-general-en',
...(eotThreshold && {DEEPGRAMFLUX_SPEECH_EOT_THRESHOLD: eotThreshold}),
...(eotTimeoutMs && {DEEPGRAMFLUX_SPEECH_EOT_TIMEOUT_MS: eotTimeoutMs}),
...(mipOptOut && {DEEPGRAMFLUX_SPEECH_MIP_OPT_OUT: mipOptOut}),
...(eagerEotThreshold && {DEEPGRAMFLUX_SPEECH_EAGER_EOT_THRESHOLD: eagerEotThreshold}),
...(keyterms && keyterms.length > 0 && {DEEPGRAMFLUX_SPEECH_KEYTERMS: keyterms.join(',')}),
};
}
else if ('gladia' === vendor) {
const {host, path} = sttCredentials;
opts = {
GLADIA_SPEECH_HOST: host,
GLADIA_SPEECH_PATH: path,
};
}
else if ('soniox' === vendor) {
@@ -806,15 +1188,135 @@ module.exports = (logger) => {
...(cobaltOptions.enableConfusionNetwork && {COBALT_ENABLE_CONFUSION_NETWORK: 1}),
...(cobaltOptions.compiledContextData && {COBALT_COMPILED_CONTEXT_DATA: cobaltOptions.compiledContextData}),
};
} else if ('assemblyai' === vendor) {
}
else if ('assemblyai' === vendor) {
const serviceVersion = rOpts.assemblyAiOptions?.serviceVersion || sttCredentials.service_version || 'v2';
const {
formatTurns,
endOfTurnConfidenceThreshold,
minEndOfTurnSilenceWhenConfident,
maxTurnSilence
} = rOpts.assemblyAiOptions || {};
opts = {
...opts,
ASSEMBLYAI_API_VERSION: serviceVersion,
...(serviceVersion === 'v3' && {
...(formatTurns && {
ASSEMBLYAI_FORMAT_TURNS: formatTurns
}),
...(endOfTurnConfidenceThreshold && {
ASSEMBLYAI_END_OF_TURN_CONFIDENCE_THRESHOLD: endOfTurnConfidenceThreshold
}),
ASSEMBLYAI_MIN_END_OF_TURN_SILENCE_WHEN_CONFIDENT: minEndOfTurnSilenceWhenConfident || 500,
...(maxTurnSilence && {
ASSEMBLYAI_MAX_TURN_SILENCE: maxTurnSilence
}),
}),
...(sttCredentials.api_key) &&
{ASSEMBLYAI_API_KEY: sttCredentials.api_key},
...(rOpts.hints?.length > 0 &&
{ASSEMBLYAI_WORD_BOOST: JSON.stringify(rOpts.hints)})
};
} else if ('verbio' === vendor) {
}
else if ('houndify' === vendor) {
const {
latitude, longitude, city, state, country, timeZone, domain, audioEndpoint,
maxSilenceSeconds, maxSilenceAfterFullQuerySeconds, maxSilenceAfterPartialQuerySeconds,
vadSensitivity, vadTimeout, vadMode, vadVoiceMs, vadSilenceMs, vadDebug,
audioFormat, enableNoiseReduction, enableProfanityFilter, enablePunctuation,
enableCapitalization, confidenceThreshold, enableDisfluencyFilter,
maxResults, enableWordTimestamps, maxAlternatives, partialTranscriptInterval,
sessionTimeout, connectionTimeout, customVocabulary, languageModel
} = rOpts.houndifyOptions || {};
opts = {
...opts,
HOUNDIFY_CLIENT_ID: sttCredentials.client_id,
HOUNDIFY_CLIENT_KEY: sttCredentials.client_key,
HOUNDIFY_USER_ID: sttCredentials.user_id,
HOUNDIFY_MAX_SILENCE_SECONDS: maxSilenceSeconds || 5,
HOUNDIFY_MAX_SILENCE_AFTER_FULL_QUERY_SECONDS: maxSilenceAfterFullQuerySeconds || 1,
HOUNDIFY_MAX_SILENCE_AFTER_PARTIAL_QUERY_SECONDS: maxSilenceAfterPartialQuerySeconds || 1.5,
...(vadSensitivity && {HOUNDIFY_VAD_SENSITIVITY: vadSensitivity}),
...(vadTimeout && {HOUNDIFY_VAD_TIMEOUT: vadTimeout}),
...(vadMode && {HOUNDIFY_VAD_MODE: vadMode}),
...(vadVoiceMs && {HOUNDIFY_VAD_VOICE_MS: vadVoiceMs}),
...(vadSilenceMs && {HOUNDIFY_VAD_SILENCE_MS: vadSilenceMs}),
...(vadDebug && {HOUNDIFY_VAD_DEBUG: vadDebug}),
...(audioFormat && {HOUNDIFY_AUDIO_FORMAT: audioFormat}),
...(enableNoiseReduction && {HOUNDIFY_ENABLE_NOISE_REDUCTION: enableNoiseReduction}),
...(enableProfanityFilter && {HOUNDIFY_ENABLE_PROFANITY_FILTER: enableProfanityFilter}),
...(enablePunctuation && {HOUNDIFY_ENABLE_PUNCTUATION: enablePunctuation}),
...(enableCapitalization && {HOUNDIFY_ENABLE_CAPITALIZATION: enableCapitalization}),
...(confidenceThreshold && {HOUNDIFY_CONFIDENCE_THRESHOLD: confidenceThreshold}),
...(enableDisfluencyFilter && {HOUNDIFY_ENABLE_DISFLUENCY_FILTER: enableDisfluencyFilter}),
...(maxResults && {HOUNDIFY_MAX_RESULTS: maxResults}),
...(enableWordTimestamps && {HOUNDIFY_ENABLE_WORD_TIMESTAMPS: enableWordTimestamps}),
...(maxAlternatives && {HOUNDIFY_MAX_ALTERNATIVES: maxAlternatives}),
...(partialTranscriptInterval && {HOUNDIFY_PARTIAL_TRANSCRIPT_INTERVAL: partialTranscriptInterval}),
...(sessionTimeout && {HOUNDIFY_SESSION_TIMEOUT: sessionTimeout}),
...(connectionTimeout && {HOUNDIFY_CONNECTION_TIMEOUT: connectionTimeout}),
...(latitude && {HOUNDIFY_LATITUDE: latitude}),
...(longitude && {HOUNDIFY_LONGITUDE: longitude}),
...(city && {HOUNDIFY_CITY: city}),
...(state && {HOUNDIFY_STATE: state}),
...(country && {HOUNDIFY_COUNTRY: country}),
...(timeZone && {HOUNDIFY_TIMEZONE: timeZone}),
...(domain && {HOUNDIFY_DOMAIN: domain}),
...(audioEndpoint && {HOUNDIFY_AUDIO_ENDPOINT: audioEndpoint}),
...(customVocabulary && {HOUNDIFY_CUSTOM_VOCABULARY:
Array.isArray(customVocabulary) ? customVocabulary.join(',') : customVocabulary}),
...(languageModel && {HOUNDIFY_LANGUAGE_MODEL: languageModel}),
};
}
else if ('voxist' === vendor) {
opts = {
...opts,
...(sttCredentials.api_key) &&
{VOXIST_API_KEY: sttCredentials.api_key},
};
}
else if ('cartesia' === vendor) {
opts = {
...opts,
...(sttCredentials.api_key &&
{CARTESIA_API_KEY: sttCredentials.api_key}),
...(sttCredentials.stt_model_id && {
CARTESIA_MODEL_ID: sttCredentials.stt_model_id
})
};
}
else if ('openai' === vendor) {
const {openaiOptions = {}} = rOpts;
const model = openaiOptions.model || rOpts.model || sttCredentials.model_id || 'whisper-1';
const apiKey = openaiOptions.apiKey || sttCredentials.api_key;
opts = {
OPENAI_MODEL: model,
OPENAI_API_KEY: apiKey,
...opts,
...(openaiOptions.prompt && {OPENAI_PROMPT: openaiOptions.prompt}),
...(openaiOptions.input_audio_noise_reduction &&
{OPENAI_INPUT_AUDIO_NOISE_REDUCTION: openaiOptions.input_audio_noise_reduction}),
};
if (openaiOptions.turn_detection) {
opts = {
...opts,
OPENAI_TURN_DETECTION_TYPE: openaiOptions.turn_detection.type,
...(openaiOptions.turn_detection.threshold && {
OPENAI_TURN_DETECTION_THRESHOLD: openaiOptions.turn_detection.threshold
}),
...(openaiOptions.turn_detection.prefix_padding_ms && {
OPENAI_TURN_DETECTION_PREFIX_PADDING_MS: openaiOptions.turn_detection.prefix_padding_ms
}),
...(openaiOptions.turn_detection.silence_duration_ms && {
OPENAI_TURN_DETECTION_SILENCE_DURATION_MS: openaiOptions.turn_detection.silence_duration_ms
}),
};
}
}
else if ('verbio' === vendor) {
const {verbioOptions = {}} = rOpts;
opts = {
...opts,
@@ -833,8 +1335,55 @@ module.exports = (logger) => {
...(verbioOptions.speech_incomplete_timeout &&
{VERBIO_SPEECH_INCOMPLETE_TIMEOUT: verbioOptions.speech_incomplete_timeout}),
};
} else if (vendor.startsWith('custom:')) {
let {options = {}} = rOpts;
}
else if ('speechmatics' === vendor) {
const {speechmaticsOptions = {}} = rOpts;
opts = {
...opts,
...(sttCredentials.api_key) && {SPEECHMATICS_API_KEY: sttCredentials.api_key},
...(sttCredentials.speechmatics_stt_uri) && {SPEECHMATICS_HOST: sttCredentials.speechmatics_stt_uri},
...(rOpts.hints?.length > 0 && {SPEECHMATICS_SPEECH_HINTS: rOpts.hints.join(',')}),
...(speechmaticsOptions.translation_config &&
{
SPEECHMATICS_TRANSLATION_LANGUAGES: speechmaticsOptions.translation_config.target_languages.join(','),
SPEECHMATICS_TRANSLATION_PARTIALS: speechmaticsOptions.translation_config.enable_partials ? 1 : 0
}
),
...(speechmaticsOptions.transcription_config?.domain &&
{SPEECHMATICS_DOMAIN: speechmaticsOptions.transcription_config.domain}),
...{SPEECHMATICS_MAX_DELAY: speechmaticsOptions.transcription_config?.max_delay || 0.7},
...{SPEECHMATICS_MAX_DELAY_MODE: speechmaticsOptions.transcription_config?.max_delay_mode || 'flexible'},
...(speechmaticsOptions.transcription_config?.diarization &&
{SPEECHMATICS_DIARIZATION: speechmaticsOptions.transcription_config.diarization}),
...(speechmaticsOptions.transcription_config?.speaker_diarization_config?.speaker_sensitivity &&
{SPEECHMATICS_DIARIZATION_SPEAKER_SENSITIVITY:
speechmaticsOptions.transcription_config.speaker_diarization_config.speaker_sensitivity}),
...(speechmaticsOptions.transcription_config?.speaker_diarization_config?.max_speakers &&
{SPEECHMATICS_DIARIZATION_MAX_SPEAKERS:
speechmaticsOptions.transcription_config.speaker_diarization_config.max_speakers}),
...(speechmaticsOptions.transcription_config?.output_locale &&
{SPEECHMATICS_OUTPUT_LOCALE: speechmaticsOptions.transcription_config.output_locale}),
...(speechmaticsOptions.transcription_config?.punctuation_overrides?.permitted_marks &&
{SPEECHMATICS_PUNCTUATION_ALLOWED:
speechmaticsOptions.transcription_config.punctuation_overrides.permitted_marks.join(',')}),
...(speechmaticsOptions.transcription_config?.punctuation_overrides?.sensitivity &&
{SPEECHMATICS_PUNCTUATION_SENSITIVITY:
speechmaticsOptions.transcription_config?.punctuation_overrides?.sensitivity}),
...(speechmaticsOptions.transcription_config?.operating_point &&
{SPEECHMATICS_OPERATING_POINT: speechmaticsOptions.transcription_config.operating_point}),
...(speechmaticsOptions.transcription_config?.enable_entities &&
{SPEECHMATICS_ENABLE_ENTTIES: speechmaticsOptions.transcription_config.enable_entities}),
...(speechmaticsOptions.transcription_config?.audio_filtering_config?.volume_threshold &&
{SPEECHMATICS_VOLUME_THRESHOLD:
speechmaticsOptions.transcription_config.audio_filtering_config.volume_threshold}),
...(speechmaticsOptions.transcription_config?.transcript_filtering_config?.remove_disfluencies &&
{SPEECHMATICS_REMOVE_DISFLUENCIES:
speechmaticsOptions.transcription_config.transcript_filtering_config.remove_disfluencies})
};
}
else if (vendor.startsWith('custom:')) {
let {options = {}} = rOpts.customOptions || {};
const {sampleRate} = rOpts.customOptions || {};
const {auth_token, custom_stt_url} = sttCredentials;
options = {
...options,
@@ -842,14 +1391,15 @@ module.exports = (logger) => {
{hints: rOpts.hints}),
...(rOpts.hints?.length > 0 && typeof rOpts.hints[0] === 'object' &&
{hints: JSON.stringify(rOpts.hints)}),
...(typeof rOpts.hintsBoost === 'number' && {hintsBoost: rOpts.hintsBoost})
...(typeof rOpts.hintsBoost === 'number' && {hintsBoost: rOpts.hintsBoost}),
...(task.cs?.callSid && {callSid: task.cs.callSid})
};
opts = {
...opts,
...(auth_token && {JAMBONZ_STT_API_KEY: auth_token}),
JAMBONZ_STT_URL: custom_stt_url,
...(Object.keys(options).length > 0 && {JAMBONZ_STT_OPTIONS: JSON.stringify(options)}),
...(sampleRate && {JAMBONZ_STT_SAMPLING: sampleRate})
};
}
@@ -899,6 +1449,6 @@ module.exports = (logger) => {
setChannelVarsForStt,
setSpeechCredentialsAtRuntime,
compileSonioxTranscripts,
consolidateTranscripts
consolidateTranscripts,
};
};
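
A sketch of the common shape the normalizers above produce (illustrative; it assumes normalizeTranscription is among this module's exports, which the truncated export hunk above does not show in full):

// speechUtils = require(<this module>)(logger)
const { normalizeTranscription } = speechUtils;
const normalized = normalizeTranscription(vendorEvt, 'deepgram', 1, 'en-US', false, true);
// => {
//   language_code: 'en-US', channel_tag: 1, is_final: true|false,
//   alternatives: [{ confidence, transcript }],
//   vendor: { name: 'deepgram', evt: <original vendor event> }
// }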

View File

@@ -0,0 +1,467 @@
const Emitter = require('events');
const assert = require('assert');
const {
TtsStreamingEvents,
TtsStreamingConnectionStatus
} = require('../utils/constants');
const MAX_CHUNK_SIZE = 1800;
const HIGH_WATER_BUFFER_SIZE = 1000;
const LOW_WATER_BUFFER_SIZE = 200;
const TIMEOUT_RETRY_MSECS = 1000; // 1 second
const isWhitespace = (str) => /^\s*$/.test(str);
/**
* Each queue item is an object:
* - { type: 'text', value: '…' } for text tokens.
* - { type: 'flush' } for a flush command.
*/
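// e.g. while the connection is still being established, bufferTokens('Hello world. ')
// followed by flush() leaves the queue as:
//   [ { type: 'text', value: 'Hello world. ' }, { type: 'flush' } ]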
class TtsStreamingBuffer extends Emitter {
constructor(cs) {
super();
this.cs = cs;
this.logger = cs.logger;
// Use an array to hold our structured items.
this.queue = [];
// Track total number of characters in text items.
this.bufferedLength = 0;
this.eventHandlers = [];
this._isFull = false;
this._connectionStatus = TtsStreamingConnectionStatus.NotConnected;
this.timer = null;
// Record the last time the text buffer was updated.
this.lastUpdateTime = 0;
}
get isEmpty() {
return this.queue.length === 0;
}
get size() {
return this.bufferedLength;
}
get isFull() {
return this._isFull;
}
get ep() {
return this.cs?.ep;
}
async start() {
assert.ok(
this._connectionStatus === TtsStreamingConnectionStatus.NotConnected,
'TtsStreamingBuffer:start already started, or has failed'
);
this.vendor = this.cs.getTsStreamingVendor();
if (!this.vendor) {
this.logger.info('TtsStreamingBuffer:start No TTS streaming vendor configured');
throw new Error('No TTS streaming vendor configured');
}
this.logger.info(`TtsStreamingBuffer:start Connecting to TTS streaming with vendor ${this.vendor}`);
this._connectionStatus = TtsStreamingConnectionStatus.Connecting;
try {
if (this.eventHandlers.length === 0) this._initHandlers(this.ep);
await this._api(this.ep, [this.ep.uuid, 'connect']);
} catch (err) {
this.logger.info({ err }, 'TtsStreamingBuffer:start Error connecting to TTS streaming');
this._connectionStatus = TtsStreamingConnectionStatus.Failed;
}
}
stop() {
clearTimeout(this.timer);
this.removeCustomEventListeners();
if (this.ep) {
this._api(this.ep, [this.ep.uuid, 'stop'])
.catch((err) =>
this.logger.info({ err }, 'TtsStreamingBuffer:stop Error closing TTS streaming')
);
}
this.timer = null;
this.queue = [];
this.bufferedLength = 0;
this._connectionStatus = TtsStreamingConnectionStatus.NotConnected;
}
/**
* Buffer new text tokens.
*/
async bufferTokens(tokens) {
if (this._connectionStatus === TtsStreamingConnectionStatus.Failed) {
this.logger.info('TtsStreamingBuffer:bufferTokens TTS streaming connection failed, rejecting request');
return { status: 'failed', reason: `connection to ${this.vendor} failed` };
}
if (0 === this.bufferedLength && isWhitespace(tokens)) {
this.logger.debug({tokens}, 'TtsStreamingBuffer:bufferTokens discarded whitespace tokens');
return { status: 'ok' };
}
const displayedTokens = tokens.length <= 40 ? tokens : tokens.substring(0, 40);
const totalLength = tokens.length;
if (this.bufferedLength + totalLength > HIGH_WATER_BUFFER_SIZE) {
this.logger.info(
`TtsStreamingBuffer throttling: buffer is full, rejecting request to buffer ${totalLength} tokens`
);
if (!this._isFull) {
this._isFull = true;
this.emit(TtsStreamingEvents.Pause);
}
return { status: 'failed', reason: 'full' };
}
this.logger.debug(
`TtsStreamingBuffer:bufferTokens "${displayedTokens}" (length: ${totalLength})`
);
this.queue.push({ type: 'text', value: tokens });
this.bufferedLength += totalLength;
// Update the last update time each time new text is buffered.
this.lastUpdateTime = Date.now();
await this._feedQueue();
return { status: 'ok' };
}
/**
* Insert a flush command. If no text is queued, flush immediately.
* Otherwise, append a flush marker so that all text preceding it will be sent
* (regardless of sentence boundaries) before the flush is issued.
*/
flush() {
if (this._connectionStatus === TtsStreamingConnectionStatus.Connecting) {
this.logger.debug('TtsStreamingBuffer:flush TTS stream is not quite ready - wait for connect');
if (this.queue.length === 0 || this.queue[this.queue.length - 1].type !== 'flush') {
this.queue.push({ type: 'flush' });
}
return;
}
else if (this._connectionStatus === TtsStreamingConnectionStatus.Connected) {
if (this.isEmpty) {
this._doFlush();
}
else {
if (this.queue[this.queue.length - 1].type !== 'flush') {
this.queue.push({ type: 'flush' });
this.logger.debug('TtsStreamingBuffer:flush added flush marker to queue');
}
}
}
else {
this.logger.debug(
`TtsStreamingBuffer:flush TTS stream is not connected, status: ${this._connectionStatus}`
);
}
}
clear() {
this.logger.debug('TtsStreamingBuffer:clear');
if (this._connectionStatus !== TtsStreamingConnectionStatus.Connected) return;
clearTimeout(this.timer);
this._api(this.ep, [this.ep.uuid, 'clear']).catch((err) =>
this.logger.info({ err }, 'TtsStreamingBuffer:clear Error clearing TTS streaming')
);
this.queue = [];
this.bufferedLength = 0;
this.timer = null;
this._isFull = false;
}
/**
* Process the queue in two phases.
*
* Phase 1: Look for flush markers. When a flush marker is found (even if not at the very front),
* send all text tokens that came before it immediately (ignoring sentence boundaries)
* and then send the flush command. Repeat until there are no flush markers left.
*
* Phase 2: With the remaining queue (now containing only text items), accumulate text
* up to MAX_CHUNK_SIZE and use sentence-boundary logic to determine a chunk.
* Then, remove the exact tokens (or portions thereof) that were consumed.
*/
async _feedQueue(handlingTimeout = false) {
this.logger.debug({ queue: this.queue }, 'TtsStreamingBuffer:_feedQueue');
try {
if (!this.cs.isTtsStreamOpen || !this.ep) {
this.logger.debug('TtsStreamingBuffer:_feedQueue TTS stream is not open or no endpoint available');
return;
}
if (this._connectionStatus !== TtsStreamingConnectionStatus.Connected) {
this.logger.debug('TtsStreamingBuffer:_feedQueue TTS stream is not connected');
return;
}
// --- Phase 1: Process flush markers ---
// Process any flush marker that isn't in the very first position.
let flushIndex = this.queue.findIndex((item, idx) => item.type === 'flush' && idx > 0);
while (flushIndex !== -1) {
let flushText = '';
// Accumulate all text tokens preceding the flush marker.
for (let i = 0; i < flushIndex; i++) {
if (this.queue[i].type === 'text') {
flushText += this.queue[i].value;
}
}
// Remove those text items.
for (let i = 0; i < flushIndex; i++) {
const item = this.queue.shift();
if (item.type === 'text') {
this.bufferedLength -= item.value.length;
}
}
// Remove the flush marker (now at the front).
if (this.queue.length > 0 && this.queue[0].type === 'flush') {
this.queue.shift();
}
// Immediately send all accumulated text (ignoring sentence boundaries).
if (flushText.length > 0) {
const modifiedFlushText = flushText.replace(/\n\n/g, '\n \n');
try {
await this._api(this.ep, [this.ep.uuid, 'send', modifiedFlushText]);
} catch (err) {
this.logger.info({ err, flushText }, 'TtsStreamingBuffer:_feedQueue Error sending TTS chunk');
}
}
// Send the flush command.
await this._doFlush();
flushIndex = this.queue.findIndex((item, idx) => item.type === 'flush' && idx > 0);
}
// If a flush marker is at the very front, process it.
while (this.queue.length > 0 && this.queue[0].type === 'flush') {
this.queue.shift();
await this._doFlush();
}
// --- Phase 2: Process remaining text tokens ---
if (this.queue.length === 0) {
this._removeTimer();
return;
}
// Accumulate contiguous text tokens (from the front) up to MAX_CHUNK_SIZE.
let combinedText = '';
for (const item of this.queue) {
if (item.type !== 'text') break;
combinedText += item.value;
if (combinedText.length >= MAX_CHUNK_SIZE) break;
}
if (combinedText.length === 0) {
this._removeTimer();
return;
}
const limit = Math.min(MAX_CHUNK_SIZE, combinedText.length);
let chunkEnd = findSentenceBoundary(combinedText, limit);
if (chunkEnd <= 0) {
if (handlingTimeout) {
chunkEnd = findWordBoundary(combinedText, limit);
if (chunkEnd <= 0) {
this._setTimerIfNeeded();
return;
}
} else {
this._setTimerIfNeeded();
return;
}
}
const chunk = combinedText.slice(0, chunkEnd);
// Check if the chunk is only whitespace before processing the queue
// If so, wait for more meaningful text
if (isWhitespace(chunk)) {
this.logger.debug('TtsStreamingBuffer:_feedQueue chunk is only whitespace, waiting for more text');
this._setTimerIfNeeded();
return;
}
// Now we iterate over the queue items
// and deduct their lengths until we've accounted for chunkEnd characters.
let remaining = chunkEnd;
let tokensProcessed = 0;
for (let i = 0; i < this.queue.length; i++) {
const token = this.queue[i];
if (token.type !== 'text') break;
if (remaining >= token.value.length) {
remaining -= token.value.length;
tokensProcessed = i + 1;
} else {
// Partially consumed token: update its value to remove the consumed part.
token.value = token.value.slice(remaining);
tokensProcessed = i;
remaining = 0;
break;
}
}
// Remove the fully consumed tokens from the front of the queue.
this.queue.splice(0, tokensProcessed);
this.bufferedLength -= chunkEnd;
const modifiedChunk = chunk.replace(/\n\n/g, '\n \n');
if (isWhitespace(modifiedChunk)) {
this.logger.debug('TtsStreamingBuffer:_feedQueue modified chunk is only whitespace, restoring queue');
this.queue.unshift({ type: 'text', value: chunk });
this.bufferedLength += chunkEnd;
this._setTimerIfNeeded();
return;
}
this.logger.debug(`TtsStreamingBuffer:_feedQueue sending chunk to tts: ${modifiedChunk}`);
try {
await this._api(this.ep, [this.ep.uuid, 'send', modifiedChunk]);
} catch (err) {
this.logger.info({ err, chunk }, 'TtsStreamingBuffer:_feedQueue Error sending TTS chunk');
}
if (this._isFull && this.bufferedLength <= LOW_WATER_BUFFER_SIZE) {
this.logger.info('TtsStreamingBuffer throttling: buffer is no longer full - resuming');
this._isFull = false;
this.emit(TtsStreamingEvents.Resume);
}
return this._feedQueue();
} catch (err) {
this.logger.info({ err }, 'TtsStreamingBuffer:_feedQueue Error sending TTS chunk');
this.queue = [];
this.bufferedLength = 0;
}
}
async _api(ep, args) {
const apiCmd = `uuid_${this.vendor.startsWith('custom:') ? 'custom' : this.vendor}_tts_streaming`;
const res = await ep.api(apiCmd, `^^|${args.join('|')}`);
if (!res.body?.startsWith('+OK')) {
this.logger.info({ args }, `Error calling ${apiCmd}: ${res.body}`);
throw new Error(`Error calling ${apiCmd}: ${res.body}`);
}
}
_doFlush() {
return this._api(this.ep, [this.ep.uuid, 'flush'])
.then(() => this.logger.debug('TtsStreamingBuffer:_doFlush sent flush command'))
.catch((err) =>
this.logger.info(
{ err },
`TtsStreamingBuffer:_doFlush Error flushing TTS streaming: ${JSON.stringify(err)}`
)
);
}
async _onConnect(vendor) {
this.logger.info(`TtsStreamingBuffer:_onConnect streaming tts connection to ${vendor} succeeded`);
this._connectionStatus = TtsStreamingConnectionStatus.Connected;
if (this.queue.length > 0) {
await this._feedQueue();
}
this.emit(TtsStreamingEvents.Connected, { vendor });
}
_onConnectFailure(vendor) {
this.logger.info(`TtsStreamingBuffer:_onConnectFailure streaming tts connection to ${vendor} failed`);
this._connectionStatus = TtsStreamingConnectionStatus.Failed;
this.queue = [];
this.bufferedLength = 0;
this.emit(TtsStreamingEvents.ConnectFailure, { vendor });
}
_setTimerIfNeeded() {
if (this.bufferedLength > 0 && !this.timer) {
this.logger.debug({queue: this.queue},
`TtsStreamingBuffer:_setTimerIfNeeded setting timer because ${this.bufferedLength} buffered`);
this.timer = setTimeout(this._onTimeout.bind(this), TIMEOUT_RETRY_MSECS);
}
}
_removeTimer() {
if (this.timer) {
this.logger.debug('TtsStreamingBuffer:_removeTimer clearing timer');
clearTimeout(this.timer);
this.timer = null;
}
}
_onTimeout() {
this.logger.debug('TtsStreamingBuffer:_onTimeout Timeout waiting for sentence boundary');
this.timer = null;
// Check if new text has been added since the timer was set.
const now = Date.now();
if (now - this.lastUpdateTime < TIMEOUT_RETRY_MSECS) {
this.logger.debug('TtsStreamingBuffer:_onTimeout New text received recently; postponing flush.');
this._setTimerIfNeeded();
return;
}
this._feedQueue(true);
}
_onTtsEmpty(vendor) {
this.emit(TtsStreamingEvents.Empty, { vendor });
}
addCustomEventListener(ep, event, handler) {
this.eventHandlers.push({ ep, event, handler });
ep.addCustomEventListener(event, handler);
}
removeCustomEventListeners() {
this.eventHandlers.forEach((h) => h.ep.removeCustomEventListener(h.event, h.handler));
this.eventHandlers.length = 0;
}
_initHandlers(ep) {
[
'deepgram',
'cartesia',
'elevenlabs',
'rimelabs',
'custom'
].forEach((vendor) => {
const eventClassName = `${vendor.charAt(0).toUpperCase() + vendor.slice(1)}TtsStreamingEvents`;
const eventClass = require('../utils/constants')[eventClassName];
if (!eventClass) throw new Error(`Event class for vendor ${vendor} not found`);
this.addCustomEventListener(ep, eventClass.Connect, this._onConnect.bind(this, vendor));
this.addCustomEventListener(ep, eventClass.ConnectFailure, this._onConnectFailure.bind(this, vendor));
this.addCustomEventListener(ep, eventClass.Empty, this._onTtsEmpty.bind(this, vendor));
});
}
}
const findSentenceBoundary = (text, limit) => {
// Look for punctuation or double newline that signals sentence end.
const sentenceEndRegex = /[.!?](?=\s|$)|\n\n/g;
let lastSentenceBoundary = -1;
let match;
while ((match = sentenceEndRegex.exec(text)) && match.index < limit) {
const precedingText = text.slice(0, match.index).trim();
if (precedingText.length > 0) {
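// Accept the boundary unless the punctuation directly follows a digit (e.g. the '.' in "3.14"),
// so numbers are not split mid-token; a double newline always counts as a boundary.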
if (
match[0] === '\n\n' ||
(match.index === 0 || !/\d$/.test(text[match.index - 1]))
) {
lastSentenceBoundary = match.index + (match[0] === '\n\n' ? 2 : 1);
}
}
}
return lastSentenceBoundary;
};
const findWordBoundary = (text, limit) => {
const wordBoundaryRegex = /\s+/g;
let lastWordBoundary = -1;
let match;
while ((match = wordBoundaryRegex.exec(text)) && match.index < limit) {
lastWordBoundary = match.index;
}
return lastWordBoundary;
};
module.exports = TtsStreamingBuffer;
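
For orientation, a minimal usage sketch of the buffer's public surface shown above (bufferTokens, flush, clear and the Pause/Resume events). How the buffer is obtained from the call session, and the constants import path, are assumptions for illustration, not part of this changeset:

// Hypothetical caller, inside an async function; assumes the CallSession (cs) exposes the buffer
// and that TtsStreamingEvents comes from lib/utils/constants (path assumed).
const buffer = cs.ttsStreamingBuffer;
buffer.on(TtsStreamingEvents.Pause, () => { /* stop requesting tokens upstream */ });
buffer.on(TtsStreamingEvents.Resume, () => { /* safe to send tokens again */ });

const { status, reason } = await buffer.bufferTokens('Hello, how can I help you today? ');
if (status === 'failed') logger.info({ reason }, 'tokens rejected');  // e.g. 'full' or a vendor failure
buffer.flush();   // push out any partial sentence once the LLM turn is complete
// buffer.clear(); // abandon queued audio, e.g. on barge-in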


@@ -1,7 +1,8 @@
const assert = require('assert');
const BaseRequestor = require('./base-requestor');
const short = require('short-uuid');
const {HookMsgTypes} = require('./constants.json');
const parseUrl = require('parse-url');
const {HookMsgTypes, WS_CLOSE_CODES} = require('./constants.json');
const Websocket = require('ws');
const snakeCaseKeys = require('./snakecase-keys');
const {
@@ -12,6 +13,20 @@ const {
JAMBONES_WS_MAX_PAYLOAD,
HTTP_USER_AGENT_HEADER
} = require('../config');
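// message types for which request() does not wait for an ack from the far end (see wantsAck below)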
const MTYPE_WANTS_ACK = [
'call:status',
'verb:status',
'jambonz:error',
'llm:event',
'llm:tool-call',
'tts:streaming-event',
'tts:tokens-result',
];
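// incoming commands that may legitimately arrive without a data payload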
const MTYPE_NO_DATA = [
'llm:tool-output',
'tts:flush',
'tts:clear'
];
class WsRequestor extends BaseRequestor {
constructor(logger, account_sid, hook, secret) {
@@ -27,6 +42,19 @@ class WsRequestor extends BaseRequestor {
assert(this._isAbsoluteUrl(this.url));
const parsedUrl = parseUrl(this.url);
const hash = parsedUrl.hash || '';
const hashObj = hash ? this._parseHashParams(hash) : {};
// remove hash
this.cleanUrl = hash ? this.url.replace(`#${hash}`, '') : this.url;
// Retry policy: rp valid values: 4xx, 5xx, ct, rt, all, default is ct
// Retry count: rc valid values: 1-5, default is 5 for websockets
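// e.g. wss://example.com/myapp#rc=3&rp=ct,5xx  -> retry up to 3 times on connection errors or 5xx responses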
this.maxReconnects = Math.min(Math.abs(parseInt(hashObj.rc) || MAX_RECONNECTS), 5);
this.retryPolicy = hashObj.rp || 'ct';
this.retryPolicyValues = this.retryPolicy.split(',').map((v) => v.trim());
this.on('socket-closed', this._onSocketClosed.bind(this));
}
@@ -41,10 +69,10 @@ class WsRequestor extends BaseRequestor {
* @param {string} [hook.password] - if basic auth is protecting the endpoint
* @param {object} [params] - request parameters
*/
async request(type, hook, params, httpHeaders = {}) {
async request(type, hook, params, httpHeaders = {}, span) {
assert(HookMsgTypes.includes(type));
const url = hook.url || hook;
const wantsAck = !['call:status', 'verb:status', 'jambonz:error'].includes(type);
const wantsAck = !MTYPE_WANTS_ACK.includes(type);
if (this.maliciousClient) {
this.logger.info({url: this.url}, 'WsRequestor:request - discarding msg to malicious client');
@@ -73,7 +101,7 @@ class WsRequestor extends BaseRequestor {
this.close();
this.emit('handover', requestor);
}
return requestor.request(type, hook, params, httpHeaders);
return requestor.request(type, hook, params, httpHeaders, span);
}
/* connect if necessary */
@@ -97,16 +125,56 @@ class WsRequestor extends BaseRequestor {
}
this.connectInProgress = true;
this.logger.debug(`WsRequestor:request(${this.id}) - connecting since we do not have a connection for ${type}`);
if (this.connections >= MAX_RECONNECTS) {
return Promise.reject(`max attempts connecting to ${this.url}`);
}
try {
const startAt = process.hrtime();
await this._connect();
const rtt = this._roundTrip(startAt);
this.stats.histogram('app.hook.connect_time', rtt, ['hook_type:app']);
let retryCount = 0;
let lastError = null;
while (retryCount <= this.maxReconnects) {
try {
this.logger.debug({retryCount, maxReconnects: this.maxReconnects},
'WsRequestor:request - attempting connection retry');
// Ensure clean state before each connection attempt
if (this.ws) {
this.ws.removeAllListeners();
this.ws = null;
}
const startAt = process.hrtime();
await this._connect();
const rtt = this._roundTrip(startAt);
this.stats.histogram('app.hook.connect_time', rtt, ['hook_type:app']);
lastError = null;
break;
} catch (error) {
lastError = error;
retryCount++;
if (retryCount <= this.maxReconnects &&
this.retryPolicyValues?.length &&
this._shouldRetry(error, this.retryPolicyValues)) {
const delay = this.backoffMs;
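// backoff grows exponentially (doubling) until it reaches 2s, then by 2s per subsequent attempt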
this.backoffMs = this.backoffMs < 2000 ? this.backoffMs * 2 : (this.backoffMs + 2000);
this.logger.debug({delay}, 'WsRequestor:request - waiting before retry');
await new Promise((resolve) => setTimeout(resolve, delay));
continue;
}
this.logger.error({error: error.message, retryCount, maxReconnects: this.maxReconnects},
'WsRequestor:request - all connection attempts failed');
throw lastError;
}
}
// If we exit the loop without success, throw the last error
if (lastError) {
throw lastError;
}
} catch (err) {
this.logger.info({url, err}, 'WsRequestor:request - failed connecting');
this.logger.info({url, err, retryPolicy: this.retryPolicy},
'WsRequestor:request - all connection attempts failed');
this.connectInProgress = false;
return Promise.reject(err);
}
@@ -118,8 +186,8 @@ class WsRequestor extends BaseRequestor {
assert(this.ws);
/* prepare and send message */
let payload = params ? snakeCaseKeys(params, ['customerData', 'sip']) : null;
if (type === 'session:new') this._sessionData = payload;
let payload = params ? snakeCaseKeys(params, ['customerData', 'sip', 'env_vars', 'args']) : null;
if (type === 'session:new' || type === 'session:adulting') this._sessionData = payload;
if (type === 'session:reconnect') payload = this._sessionData;
assert.ok(url, 'WsRequestor:request url was not provided');
@@ -132,17 +200,23 @@ class WsRequestor extends BaseRequestor {
type,
msgid,
call_sid: this.call_sid,
hook: ['verb:hook', 'session:redirect'].includes(type) ? url : undefined,
hook: [
'verb:hook', 'dial:confirm', 'session:redirect', 'llm:event', 'llm:tool-call'
].includes(type) ? url : undefined,
data: {...payload},
...b3
};
// add msgid to span attributes if it exists
if (span) {
span.setAttributes({'msgid': msgid});
}
const sendQueuedMsgs = () => {
if (this.queuedMsg.length > 0) {
for (const {type, hook, params, httpHeaders, promise} of this.queuedMsg) {
this.logger.debug(`WsRequestor:request - preparing queued ${type} for sending`);
if (promise) {
this.request(type, hook, params, httpHeaders)
this.request(type, hook, params, httpHeaders, span)
.then((res) => promise.resolve(res))
.catch((err) => promise.reject(err));
}
@@ -219,7 +293,7 @@ class WsRequestor extends BaseRequestor {
/* send the message */
this.ws.send(JSON.stringify(obj), async() => {
this.logger.debug({obj}, `WsRequestor:request websocket: sent (${url})`);
if (obj.type !== 'llm:event') this.logger.debug({obj}, `WsRequestor:request websocket: sent (${url})`);
// If session:reconnect is waiting for ack, hold here until ack to send queuedMsgs
if (this._reconnectPromise) {
try {
@@ -247,13 +321,13 @@ class WsRequestor extends BaseRequestor {
}
}
close() {
close(code = WS_CLOSE_CODES.NormalClosure) {
this.closedGracefully = true;
this.logger.debug('WsRequestor:close closing socket');
this.logger.debug(`WsRequestor:close closing socket with code ${code}`);
this._stopPingTimer();
try {
if (this.ws) {
this.ws.close(1000);
this.ws.close(code);
this.ws.removeAllListeners();
this.ws = null;
}
@@ -281,17 +355,23 @@ class WsRequestor extends BaseRequestor {
};
if (this.username && this.password) opts = {...opts, auth: `${this.username}:${this.password}`};
// Clean up any existing connection event listeners to prevent interference between retry attempts
this.removeAllListeners('ready');
this.removeAllListeners('not-ready');
this
.once('ready', (ws) => {
this.logger.debug('WsRequestor:_connect - ready event fired, resolving Promise');
this.removeAllListeners('not-ready');
if (this.connections > 1) this.request('session:reconnect', this.url);
resolve();
})
.once('not-ready', (err) => {
this.logger.error({err: err.message}, 'WsRequestor:_connect - not-ready event fired, rejecting Promise');
this.removeAllListeners('ready');
reject(err);
});
const ws = new Websocket(this.url, ['ws.jambonz.org'], opts);
const ws = new Websocket(this.cleanUrl, ['ws.jambonz.org'], opts);
this._setHandlers(ws);
});
}
@@ -315,10 +395,13 @@ class WsRequestor extends BaseRequestor {
}
_onError(err) {
if (this.connections > 0) {
this.logger.info({url: this.url, err}, 'WsRequestor:_onError');
if (this.connectInProgress) {
this.logger.info({url: this.url, err}, 'WsRequestor:_onError - emitting not-ready for connection attempt');
this.emit('not-ready', err);
}
else if (this.connections === 0) {
this.emit('not-ready', err);
}
else this.emit('not-ready', err);
}
_onOpen(ws) {
@@ -355,30 +438,44 @@ class WsRequestor extends BaseRequestor {
statusMessage: res.statusMessage
}, 'WsRequestor - unexpected response');
this.emit('connection-failure');
this.emit('not-ready', new Error(`${res.statusCode} ${res.statusMessage}`));
this.connections++;
const error = new Error(`${res.statusCode} ${res.statusMessage}`);
error.statusCode = res.statusCode;
this.connectInProgress = false;
this.emit('not-ready', error);
}
_onSocketClosed() {
this.ws = null;
this.emit('connection-dropped');
this._stopPingTimer();
if (this.connections > 0 && this.connections < MAX_RECONNECTS && !this.closedGracefully) {
if (this.connections > 0 && this.connections < this.maxReconnects && !this.closedGracefully) {
if (!this._initMsgId) this._clearPendingMessages();
this.logger.debug(`WsRequestor:_onSocketClosed waiting ${this.backoffMs} to reconnect`);
setTimeout(() => {
this._scheduleReconnect('_onSocketClosed');
}
}
_scheduleReconnect(source) {
this.logger.debug(`WsRequestor:_scheduleReconnect waiting ${this.backoffMs} to reconnect (${source})`);
setTimeout(() => {
this.logger.debug(
{haveWs: !!this.ws, connectInProgress: this.connectInProgress},
`WsRequestor:_scheduleReconnect time to reconnect (${source})`);
if (!this.ws && !this.connectInProgress) {
this.connectInProgress = true;
return this._connect()
.catch((err) => this.logger.error(`WsRequestor:${source} Error while reconnecting`, err))
.finally(() => this.connectInProgress = false);
} else {
this.logger.debug(
{haveWs: !!this.ws, connectInProgress: this.connectInProgress},
'WsRequestor:_onSocketClosed time to reconnect');
if (!this.ws && !this.connectInProgress) {
this.connectInProgress = true;
return this._connect()
.catch((err) => this.logger.error('WsRequestor:_onSocketClosed There is error while reconnect', err))
.finally(() => this.connectInProgress = false);
}
}, this.backoffMs);
this.backoffMs = this.backoffMs < 2000 ? this.backoffMs * 2 : (this.backoffMs + 2000);
}
`WsRequestor:_scheduleReconnect skipping reconnect attempt (${source}) - conditions not met`);
}
}, this.backoffMs);
this.backoffMs = this.backoffMs < 2000 ? this.backoffMs * 2 : (this.backoffMs + 2000);
}
_onMessage(content, isBinary) {
@@ -392,8 +489,9 @@ class WsRequestor extends BaseRequestor {
/* messages must be JSON format */
try {
const obj = JSON.parse(content);
this.logger.debug({obj}, 'WsRequestor:_onMessage - received message');
//const {type, msgid, command, call_sid = this.call_sid, queueCommand = false, data} = obj;
const {type, msgid, command, queueCommand = false, data} = obj;
const {type, msgid, command, queueCommand = false, tool_call_id, data} = obj;
const call_sid = obj.callSid || this.call_sid;
//this.logger.debug({obj}, 'WsRequestor:request websocket: received');
@@ -407,8 +505,8 @@ class WsRequestor extends BaseRequestor {
case 'command':
assert.ok(command, 'command property not supplied');
assert.ok(data, 'data property not supplied');
this._recvCommand(msgid, command, call_sid, queueCommand, data);
assert.ok(data || MTYPE_NO_DATA.includes(command), 'data property not supplied');
this._recvCommand(msgid, command, call_sid, queueCommand, tool_call_id, data);
break;
default:
@@ -416,6 +514,21 @@ class WsRequestor extends BaseRequestor {
}
} catch (err) {
this.logger.info({err, content}, 'WsRequestor:_onMessage - invalid incoming message');
const params = {
msg: 'InvalidMessage',
details: err.message,
content: Buffer.from(content).toString('utf-8')
};
const {writeAlerts, AlertType} = this.Alerter;
writeAlerts({
account_sid: this.account_sid,
alert_type: AlertType.INVALID_APP_PAYLOAD,
target_sid: this.call_sid,
message: err.message,
}).catch((err) => this.logger.info({err}, 'Error generating alert for invalid message'));
this.request('jambonz:error', '/error', params)
.catch((err) => this.logger.debug({err}, 'WsRequestor:_onMessage - Error sending'));
}
}
@@ -432,10 +545,10 @@ class WsRequestor extends BaseRequestor {
success && success(data);
}
_recvCommand(msgid, command, call_sid, queueCommand, data) {
_recvCommand(msgid, command, call_sid, queueCommand, tool_call_id, data) {
// TODO: validate command
this.logger.debug({msgid, command, call_sid, queueCommand, data}, 'received command');
this.emit('command', {msgid, command, call_sid, queueCommand, data});
this.emit('command', {msgid, command, call_sid, queueCommand, tool_call_id, data});
}
}

14189
package-lock.json generated

File diff suppressed because it is too large


@@ -1,6 +1,6 @@
{
"name": "jambonz-feature-server",
"version": "0.9.0",
"version": "0.9.5",
"main": "app.js",
"engines": {
"node": ">= 18.x"
@@ -27,14 +27,15 @@
"dependencies": {
"@aws-sdk/client-auto-scaling": "^3.549.0",
"@aws-sdk/client-sns": "^3.549.0",
"@jambonz/db-helpers": "^0.9.6",
"@jambonz/db-helpers": "^0.9.18",
"@jambonz/http-health-check": "^0.0.1",
"@jambonz/mw-registrar": "^0.2.7",
"@jambonz/realtimedb-helpers": "^0.8.8",
"@jambonz/speech-utils": "^0.1.11",
"@jambonz/realtimedb-helpers": "^0.8.15",
"@jambonz/speech-utils": "^0.2.26",
"@jambonz/stats-collector": "^0.1.10",
"@jambonz/time-series": "^0.2.8",
"@jambonz/verb-specifications": "^0.0.74",
"@jambonz/time-series": "^0.2.14",
"@jambonz/verb-specifications": "^0.0.119",
"@modelcontextprotocol/sdk": "^1.9.0",
"@opentelemetry/api": "^1.8.0",
"@opentelemetry/exporter-jaeger": "^1.23.0",
"@opentelemetry/exporter-trace-otlp-http": "^0.50.0",
@@ -47,24 +48,21 @@
"bent": "^7.3.12",
"debug": "^4.3.4",
"deepcopy": "^2.1.0",
"drachtio-fsmrf": "^3.0.43",
"drachtio-srf": "^4.5.35",
"drachtio-fsmrf": "^4.1.2",
"drachtio-srf": "^5.0.14",
"express": "^4.19.2",
"express-validator": "^7.0.1",
"ip": "^2.0.1",
"moment": "^2.30.1",
"parse-url": "^9.2.0",
"pino": "^8.20.0",
"pino": "^10.1.0",
"polly-ssml-split": "^0.1.0",
"proxyquire": "^2.1.3",
"sdp-transform": "^2.14.2",
"sdp-transform": "^2.15.0",
"short-uuid": "^5.1.0",
"sinon": "^17.0.1",
"to-snake-case": "^1.0.0",
"undici": "^6.19.2",
"uuid-random": "^1.3.2",
"undici": "^7.5.0",
"verify-aws-sns-signature": "^0.1.0",
"ws": "^8.17.1",
"ws": "^8.18.0",
"xml2js": "^0.6.2"
},
"devDependencies": {
@@ -72,6 +70,7 @@
"eslint": "7.32.0",
"eslint-plugin-promise": "^6.1.1",
"nyc": "^15.1.0",
"proxyquire": "^2.1.3",
"tape": "^5.7.5"
},
"optionalDependencies": {


@@ -222,3 +222,62 @@ test('test create-call app_json', async(t) => {
t.error(err);
}
});
test('test create-call timeLimit', async(t) => {
clearModule.all();
const {srf, disconnect} = require('../app');
try {
await connect(srf);
// GIVEN
let from = 'create-call-app-json';
let account_sid = 'bb845d4b-83a9-4cde-a6e9-50f3743bab3f';
// Give UAS app time to come up
const p = sippUac('uas.xml', '172.38.0.10', from);
await waitFor(1000);
const startTime = Date.now();
const app_json = `[
{
"verb": "pause",
"length": 7
}
]`;
const post = bent('http://127.0.0.1:3000/', 'POST', 'json', 201);
post('v1/createCall', {
'account_sid':account_sid,
"call_hook": {
"url": "http://127.0.0.1:3100/",
"method": "POST",
"username": "username",
"password": "password"
},
app_json,
"from": from,
"to": {
"type": "phone",
"number": "15583084809"
},
"timeLimit": 1,
"speech_recognizer_vendor": "google",
"speech_recognizer_language": "en"
});
//THEN
await p;
const endTime = Date.now();
t.ok(endTime - startTime < 2000, 'create-call: timeLimit is respected');
disconnect();
} catch (err) {
console.log(`error received: ${err}`);
disconnect();
t.error(err);
}
});


@@ -1,4 +1,5 @@
/* SQLEditor (MySQL (2))*/
SET FOREIGN_KEY_CHECKS=0;
DROP TABLE IF EXISTS account_static_ips;
@@ -53,6 +54,8 @@ DROP TABLE IF EXISTS signup_history;
DROP TABLE IF EXISTS smpp_addresses;
DROP TABLE IF EXISTS google_custom_voices;
DROP TABLE IF EXISTS speech_credentials;
DROP TABLE IF EXISTS system_information;
@@ -136,6 +139,9 @@ account_sid CHAR(36) NOT NULL,
is_active BOOLEAN NOT NULL DEFAULT 1,
username VARCHAR(64),
password VARCHAR(1024),
allow_direct_app_calling BOOLEAN NOT NULL DEFAULT 1,
allow_direct_queue_calling BOOLEAN NOT NULL DEFAULT 1,
allow_direct_user_calling BOOLEAN NOT NULL DEFAULT 1,
PRIMARY KEY (client_sid)
);
@@ -338,11 +344,25 @@ label VARCHAR(64),
PRIMARY KEY (speech_credential_sid)
);
CREATE TABLE google_custom_voices
(
google_custom_voice_sid CHAR(36) NOT NULL UNIQUE ,
speech_credential_sid CHAR(36) NOT NULL,
model VARCHAR(512) NOT NULL,
reported_usage ENUM('REPORTED_USAGE_UNSPECIFIED','REALTIME','OFFLINE') DEFAULT 'REALTIME',
name VARCHAR(64) NOT NULL,
voice_cloning_key MEDIUMTEXT,
use_voice_cloning_key BOOLEAN DEFAULT false,
PRIMARY KEY (google_custom_voice_sid)
);
CREATE TABLE system_information
(
domain_name VARCHAR(255),
sip_domain_name VARCHAR(255),
monitoring_domain_name VARCHAR(255)
monitoring_domain_name VARCHAR(255),
private_network_cidr VARCHAR(8192),
log_level ENUM('info', 'debug') NOT NULL DEFAULT 'info'
);
CREATE TABLE users
@@ -437,11 +457,14 @@ CREATE TABLE sip_gateways
sip_gateway_sid CHAR(36),
ipv4 VARCHAR(128) NOT NULL COMMENT 'ip address or DNS name of the gateway. For gateways providing inbound calling service, ip address is required.',
netmask INTEGER NOT NULL DEFAULT 32,
port INTEGER NOT NULL DEFAULT 5060 COMMENT 'sip signaling port',
port INTEGER COMMENT 'sip signaling port',
inbound BOOLEAN NOT NULL COMMENT 'if true, whitelist this IP to allow inbound calls from the gateway',
outbound BOOLEAN NOT NULL COMMENT 'if true, include in least-cost routing when placing calls to the PSTN',
voip_carrier_sid CHAR(36) NOT NULL,
is_active BOOLEAN NOT NULL DEFAULT 1,
send_options_ping BOOLEAN NOT NULL DEFAULT 0,
use_sips_scheme BOOLEAN NOT NULL DEFAULT 0,
pad_crypto BOOLEAN NOT NULL DEFAULT 0,
protocol ENUM('udp','tcp','tls', 'tls/srtp') DEFAULT 'udp' COMMENT 'Outbound call protocol',
PRIMARY KEY (sip_gateway_sid)
) COMMENT='A whitelisted sip gateway used for origination/termination';
@@ -478,11 +501,19 @@ messaging_hook_sid CHAR(36) COMMENT 'webhook to call for inbound SMS/MMS ',
app_json TEXT,
speech_synthesis_vendor VARCHAR(64) NOT NULL DEFAULT 'google',
speech_synthesis_language VARCHAR(12) NOT NULL DEFAULT 'en-US',
speech_synthesis_voice VARCHAR(64),
speech_synthesis_voice VARCHAR(256),
speech_synthesis_label VARCHAR(64),
speech_recognizer_vendor VARCHAR(64) NOT NULL DEFAULT 'google',
speech_recognizer_language VARCHAR(64) NOT NULL DEFAULT 'en-US',
speech_recognizer_label VARCHAR(64),
use_for_fallback_speech BOOLEAN DEFAULT false,
fallback_speech_synthesis_vendor VARCHAR(64),
fallback_speech_synthesis_language VARCHAR(12),
fallback_speech_synthesis_voice VARCHAR(256),
fallback_speech_synthesis_label VARCHAR(64),
fallback_speech_recognizer_vendor VARCHAR(64),
fallback_speech_recognizer_language VARCHAR(64),
fallback_speech_recognizer_label VARCHAR(64),
created_at DATETIME NOT NULL DEFAULT CURRENT_TIMESTAMP,
record_all_calls BOOLEAN NOT NULL DEFAULT false,
PRIMARY KEY (application_sid)
@@ -525,6 +556,7 @@ siprec_hook_sid CHAR(36),
record_all_calls BOOLEAN NOT NULL DEFAULT false,
record_format VARCHAR(16) NOT NULL DEFAULT 'mp3',
bucket_credential VARCHAR(8192) COMMENT 'credential used to authenticate with storage service',
enable_debug_log BOOLEAN NOT NULL DEFAULT false,
PRIMARY KEY (account_sid)
) COMMENT='An enterprise that uses the platform for comm services';
@@ -619,6 +651,10 @@ ALTER TABLE speech_credentials ADD FOREIGN KEY service_provider_sid_idxfk_5 (ser
CREATE INDEX account_sid_idx ON speech_credentials (account_sid);
ALTER TABLE speech_credentials ADD FOREIGN KEY account_sid_idxfk_8 (account_sid) REFERENCES accounts (account_sid);
CREATE INDEX google_custom_voice_sid_idx ON google_custom_voices (google_custom_voice_sid);
CREATE INDEX speech_credential_sid_idx ON google_custom_voices (speech_credential_sid);
ALTER TABLE google_custom_voices ADD FOREIGN KEY speech_credential_sid_idxfk (speech_credential_sid) REFERENCES speech_credentials (speech_credential_sid) ON DELETE CASCADE;
CREATE INDEX user_sid_idx ON users (user_sid);
CREATE INDEX email_idx ON users (email);
CREATE INDEX phone_idx ON users (phone);
@@ -704,4 +740,5 @@ ALTER TABLE accounts ADD FOREIGN KEY queue_event_hook_sid_idxfk (queue_event_hoo
ALTER TABLE accounts ADD FOREIGN KEY device_calling_application_sid_idxfk (device_calling_application_sid) REFERENCES applications (application_sid);
ALTER TABLE accounts ADD FOREIGN KEY siprec_hook_sid_idxfk (siprec_hook_sid) REFERENCES applications (application_sid);
SET FOREIGN_KEY_CHECKS=1;
SET FOREIGN_KEY_CHECKS=1;


@@ -3,9 +3,8 @@ const { sippUac } = require('./sipp')('test_fs');
const bent = require('bent');
const getJSON = bent('json')
const clearModule = require('clear-module');
const {provisionCallHook} = require('./utils')
const sleepFor = (ms) => new Promise((r) => setTimeout(r, ms));
const {provisionCallHook} = require('./utils');
const { sleepFor } = require('../lib/utils/helpers');
process.on('unhandledRejection', (reason, p) => {
console.log('Unhandled Rejection at: Promise', p, 'reason:', reason);


@@ -42,7 +42,7 @@ services:
ipv4_address: 172.38.0.7
drachtio:
image: drachtio/drachtio-server:0.8.25-rc8
image: drachtio/drachtio-server:0.8.26
restart: always
command: drachtio --contact "sip:*;transport=udp" --mtu 4096 --address 0.0.0.0 --port 9022
ports:
@@ -57,7 +57,7 @@ services:
condition: service_healthy
freeswitch:
image: drachtio/drachtio-freeswitch-mrf:0.7.3
image: drachtio/drachtio-freeswitch-mrf:0.9.2-4
restart: always
command: freeswitch --rtp-range-start 20000 --rtp-range-end 20100
environment:


@@ -0,0 +1,151 @@
// Test for HttpRequestor retry functionality
const test = require('tape');
const sinon = require('sinon');
const proxyquire = require('proxyquire').noCallThru();
const { createMocks, setupBaseRequestorMocks } = require('./utils/mock-helper');
// Create mocks
const mocks = createMocks();
// Mock timeSeries module
const timeSeriesMock = sinon.stub().returns(mocks.MockAlerter);
// Mock the config with required properties
const configMock = {
HTTP_POOL: '0',
HTTP_POOLSIZE: '10',
HTTP_PIPELINING: '1',
HTTP_TIMEOUT: 5000,
HTTP_PROXY_IP: null,
HTTP_PROXY_PORT: null,
HTTP_PROXY_PROTOCOL: null,
NODE_ENV: 'test',
HTTP_USER_AGENT_HEADER: 'test-agent'
};
// Mock db-helpers
const dbHelpersMock = mocks.MockDbHelpers;
// Require HttpRequestor with mocked dependencies
const BaseRequestor = proxyquire('../lib/utils/base-requestor', {
'@jambonz/time-series': timeSeriesMock,
'../config': configMock,
'../../': { srf: { locals: { stats: mocks.MockStats } } }
});
// Setup BaseRequestor mocks
setupBaseRequestorMocks(BaseRequestor);
// Require HttpRequestor with mocked dependencies
const HttpRequestor = proxyquire('../lib/utils/http-requestor', {
'./base-requestor': BaseRequestor,
'../config': configMock,
'@jambonz/db-helpers': dbHelpersMock
});
// Setup utility function
const setupRequestor = () => {
const hook = { url: 'http://localhost/test', method: 'POST' };
const requestor = new HttpRequestor(mocks.MockLogger, 'AC123', hook, 'testsecret');
requestor.stats = mocks.MockStats;
return requestor;
};
// Cleanup function for tests
const cleanup = (requestor) => {
sinon.restore();
if (requestor && requestor.close) requestor.close();
};
test('HttpRequestor: should retry on connection errors when specified in hash', async (t) => {
const requestor = setupRequestor();
// Setup a URL with retry params in the hash
const urlWithRetry = 'http://localhost/test#rc=3&rp=ct,5xx';
// First two calls fail with connection refused, third succeeds
const requestStub = sinon.stub(requestor.client, 'request');
const error = new Error('Connection refused');
error.code = 'ECONNREFUSED';
// Fail twice, succeed on third try
requestStub.onCall(0).rejects(error);
requestStub.onCall(1).rejects(error);
requestStub.onCall(2).resolves({
statusCode: 200,
headers: { 'content-type': 'application/json' },
body: { json: async () => ({ success: true }) }
});
try {
const hook = { url: urlWithRetry, method: 'GET' };
const result = await requestor.request('verb:hook', hook, null);
t.equal(requestStub.callCount, 3, 'Should have retried twice for a total of 3 calls');
t.deepEqual(result, { success: true }, 'Should return successful response');
} catch (err) {
t.fail(`Should not throw an error: ${err.message}`);
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: should respect retry count (rc) from hash', async (t) => {
const requestor = setupRequestor();
// Setup a URL with retry params in the hash - only retry once
const urlWithRetry = 'http://localhost/test#rc=1&rp=ct';
// All calls fail with connection refused
const requestStub = sinon.stub(requestor.client, 'request');
const error = new Error('Connection refused');
error.code = 'ECONNREFUSED';
// Always fail
requestStub.rejects(error);
try {
const hook = { url: urlWithRetry, method: 'GET' };
await requestor.request('verb:hook', hook, null);
t.fail('Should have thrown an error');
} catch (err) {
t.equal(requestStub.callCount, 2, 'Should have retried once for a total of 2 calls');
t.equal(err.code, 'ECONNREFUSED', 'Should throw the original error');
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: should respect retry policy (rp) from hash', async (t) => {
const requestor = setupRequestor();
// Setup a URL with retry params in hash - only retry on 5xx errors
const urlWithRetry = 'http://localhost/test#rc=2&rp=5xx';
// Fail with 404 (should not retry since rp=5xx)
const requestStub = sinon.stub(requestor.client, 'request');
requestStub.resolves({
statusCode: 404,
headers: {},
body: {}
});
try {
const hook = { url: urlWithRetry, method: 'GET' };
await requestor.request('verb:hook', hook, null);
t.fail('Should have thrown an error');
} catch (err) {
t.equal(requestStub.callCount, 1, 'Should not retry on 404 when rp=5xx');
t.equal(err.statusCode, 404, 'Should throw 404 error');
}
cleanup(requestor);
t.end();
});
module.exports = {
setupRequestor,
cleanup
};


@@ -0,0 +1,214 @@
const test = require('tape');
const sinon = require('sinon');
const { createMockedRequestors } = require('./utils/test-mocks');
// Use the shared mocks and helpers
const {
HttpRequestor,
setupRequestor,
cleanup
} = createMockedRequestors();
// All prototype overrides and setup are now handled in test-mocks.js
// --- TESTS ---
test('HttpRequestor: constructor sets up properties correctly', (t) => {
const requestor = setupRequestor();
t.equal(requestor.method, 'POST', 'method should be POST');
t.equal(requestor.url, 'http://localhost/test', 'url should be set');
t.equal(typeof requestor.client, 'object', 'client should be an object');
cleanup(requestor);
t.end();
});
test('HttpRequestor: constructor with username/password sets auth header', (t) => {
const { mocks, HttpRequestor } = createMockedRequestors();
const logger = mocks.logger;
const hook = {
url: 'http://localhost/test',
method: 'POST',
username: 'user',
password: 'pass'
};
const requestor = new HttpRequestor(logger, 'AC123', hook, 'secret');
t.ok(requestor.authHeader.Authorization, 'Authorization header should be set');
t.ok(requestor.authHeader.Authorization.startsWith('Basic '), 'Should be Basic auth');
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should return JSON on 200 response', async (t) => {
const requestor = setupRequestor();
const expectedResponse = { success: true, data: [1, 2, 3] };
const fakeBody = { json: async () => expectedResponse };
sinon.stub(requestor.client, 'request').resolves({
statusCode: 200,
headers: { 'content-type': 'application/json' },
body: fakeBody
});
try {
const hook = { url: 'http://localhost/test', method: 'POST' };
const result = await requestor.request('verb:hook', hook, { foo: 'bar' });
t.deepEqual(result, expectedResponse, 'Should return parsed JSON');
const requestCall = requestor.client.request.getCall(0);
const opts = requestCall.args[0];
t.equal(opts.method, 'POST', 'method should be POST');
t.ok(opts.headers['X-Signature'], 'Should include signature header');
t.ok(opts.body, 'Should include request body');
} catch (err) {
t.fail(err);
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should handle non-200 responses', async (t) => {
const requestor = setupRequestor();
sinon.stub(requestor.client, 'request').resolves({
statusCode: 404,
headers: {},
body: {}
});
try {
const hook = { url: 'http://localhost/test', method: 'POST' };
await requestor.request('verb:hook', hook, { foo: 'bar' });
t.fail('Should have thrown an error');
} catch (err) {
t.ok(err, 'Should throw an error');
t.equal(err.statusCode, 404, 'Error should contain status code');
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should handle ECONNREFUSED error', async (t) => {
const requestor = setupRequestor();
const error = new Error('Connection refused');
error.code = 'ECONNREFUSED';
sinon.stub(requestor.client, 'request').rejects(error);
try {
const hook = { url: 'http://localhost/test', method: 'POST' };
await requestor.request('verb:hook', hook, { foo: 'bar' });
t.fail('Should have thrown an error');
} catch (err) {
t.equal(err.code, 'ECONNREFUSED', 'Should pass through the error');
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should skip jambonz:error type', async (t) => {
const requestor = setupRequestor();
const spy = sinon.spy(requestor.client, 'request');
const hook = { url: 'http://localhost/test', method: 'POST' };
const result = await requestor.request('jambonz:error', hook, { foo: 'bar' });
t.equal(result, undefined, 'Should return undefined');
t.equal(spy.callCount, 0, 'Should not call request method');
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should handle array response', async (t) => {
const requestor = setupRequestor();
const fakeBody = { json: async () => [{ id: 1 }, { id: 2 }] };
sinon.stub(requestor.client, 'request').resolves({
statusCode: 200,
headers: { 'content-type': 'application/json' },
body: fakeBody
});
try {
const hook = { url: 'http://localhost/test', method: 'POST' };
const result = await requestor.request('verb:hook', hook, { foo: 'bar' });
t.ok(Array.isArray(result), 'Should return an array');
t.equal(result.length, 2, 'Array should have 2 items');
} catch (err) {
t.fail(err);
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: request should handle llm:tool-call type', async (t) => {
const requestor = setupRequestor();
const fakeBody = { json: async () => ({ result: 'tool output' }) };
sinon.stub(requestor.client, 'request').resolves({
statusCode: 200,
headers: { 'content-type': 'application/json' },
body: fakeBody
});
try {
const hook = { url: 'http://localhost/test', method: 'POST' };
const result = await requestor.request('llm:tool-call', hook, { tool: 'test' });
t.deepEqual(result, { result: 'tool output' }, 'Should return the parsed JSON');
} catch (err) {
t.fail(err);
}
cleanup(requestor);
t.end();
});
test('HttpRequestor: close should close the client if not using pools', (t) => {
// Ensure HTTP_POOL is set to false to disable pool usage
const oldHttpPool = process.env.HTTP_POOL;
process.env.HTTP_POOL = '0';
const requestor = setupRequestor();
// Make sure _usePools is false
requestor._usePools = false;
// Replace the client.close with a spy function
const closeSpy = sinon.spy();
requestor.client.close = closeSpy;
// Set client.closed to false to ensure the condition is met
requestor.client.closed = false;
// Call close
requestor.close();
// Check if the spy was called
t.ok(closeSpy.calledOnce, 'Should call client.close');
// Restore HTTP_POOL
process.env.HTTP_POOL = oldHttpPool;
// Don't call cleanup(requestor) as it would try to call client.close again
sinon.restore();
t.end();
});
test('HttpRequestor: request should handle URLs with fragments', async (t) => {
const requestor = setupRequestor();
// Use the same host/port as the base client to avoid creating a new client
const urlWithFragment = 'http://localhost?param1=value1#rc=5&rp=4xx,5xx,ct';
const expectedResponse = { status: 'success' };
const fakeBody = { json: async () => expectedResponse };
// Stub the request method
const requestStub = sinon.stub(requestor.client, 'request').callsFake((opts) => {
return Promise.resolve({
statusCode: 200,
headers: { 'content-type': 'application/json' },
body: fakeBody
});
});
try {
const hook = { url: urlWithFragment, method: 'GET' };
const result = await requestor.request('verb:hook', hook, null);
t.deepEqual(result, expectedResponse, 'Should return the parsed JSON response');
const requestCall = requestStub.getCall(0);
const opts = requestCall.args[0];
t.ok(opts.query && opts.query.param1 === 'value1', 'Query parameters should be parsed');
t.equal(opts.path, '/', 'Path should be extracted from URL');
t.notOk(opts.query && opts.query.rc, 'Fragment should not be included in query parameters');
} catch (err) {
t.fail(err);
}
cleanup(requestor);
t.end();
});
// test('HttpRequestor: request should handle URLs with query parameters', async (t) => {
// t.pass('Restored original require function');
// t.end();
// });


@@ -1,4 +1,8 @@
require('./ws-requestor-retry-unit-test');
require('./test_ws_retry_comprehensive');
require('./ws-requestor-unit-test');
require('./http-requestor-retry-test');
require('./http-requestor-unit-test');
require('./unit-tests');
require('./docker_start');
require('./create-test-db');
@@ -12,6 +16,7 @@ require('./sip-request-tests');
require('./create-call-test');
require('./play-tests');
require('./sip-refer-tests');
require('./sip-refer-handler-tests');
require('./listen-tests');
require('./config-test');
require('./queue-test');


@@ -3,6 +3,7 @@ const { sippUac } = require('./sipp')('test_fs');
const clearModule = require('clear-module');
const {provisionCallHook, provisionActionHook, provisionAnyHook} = require('./utils');
const bent = require('bent');
const { sleepFor } = require('../lib/utils/helpers');
const getJSON = bent('json');
process.on('unhandledRejection', (reason, p) => {
@@ -17,8 +18,6 @@ function connect(connectable) {
});
}
const sleepFor = (ms) => new Promise((resolve) => setTimeout(() => resolve(), ms));
test('\'enqueue-dequeue\' tests', async(t) => {
clearModule.all();


@@ -0,0 +1,117 @@
<?xml version="1.0" encoding="ISO-8859-1" ?>
<!DOCTYPE scenario SYSTEM "sipp.dtd">
<scenario name="UAS that accepts call and sends REFER">
<!-- Receive incoming INVITE -->
<recv request="INVITE" crlf="true">
<action>
<ereg regexp=".*" search_in="hdr" header="Subject:" assign_to="1" />
<ereg regexp=".*" search_in="hdr" header="From:" assign_to="2" />
</action>
</recv>
<!-- Send 180 Ringing -->
<send>
<![CDATA[
SIP/2.0 180 Ringing
[last_Via:]
[last_From:]
[last_To:];tag=[pid]SIPpTag01[call_number]
[last_Call-ID:]
[last_CSeq:]
[last_Record-Route:]
Subject:[$1]
Content-Length: 0
]]>
</send>
<!-- Send 200 OK with SDP -->
<send>
<![CDATA[
SIP/2.0 200 OK
[last_Via:]
[last_From:]
[last_To:];tag=[pid]SIPpTag01[call_number]
[last_Call-ID:]
[last_CSeq:]
[last_Record-Route:]
Subject:[$1]
Contact: <sip:[local_ip]:[local_port];transport=[transport]>
Content-Type: application/sdp
Content-Length: [len]
v=0
o=user1 53655765 2353687637 IN IP[local_ip_type] [local_ip]
s=-
c=IN IP[media_ip_type] [media_ip]
t=0 0
m=audio [media_port] RTP/AVP 0
a=rtpmap:0 PCMU/8000
]]>
</send>
<recv request="ACK" rtd="true" crlf="true">
<action>
<!-- Check if this is NOT the first call (tag ends with 012 or higher) -->
<ereg regexp="tag=1SIPpTag01[2-9]" search_in="hdr" header="To:" assign_to="3" />
<log message="Not first call check result: [$3]"/>
</action>
</recv>
<!-- Skip REFER if we found a non-first call tag -->
<nop next="skip_refer" test="3" value="" compare="not_equal">
<action>
<log message="Found non-first call tag [$3], skipping REFER"/>
</action>
</nop>
<!-- Wait a moment, then send REFER (only on first call) -->
<pause milliseconds="1000"/>
<nop>
<action>
<log message="Sending REFER for first call"/>
</action>
</nop>
<!-- Send REFER (only on first iteration) -->
<send retrans="500">
<![CDATA[
REFER sip:service@[remote_ip]:[remote_port] SIP/2.0
Via: SIP/2.0/[transport] [local_ip]:[local_port];branch=[branch]
From: <sip:[local_ip]:[local_port]>;tag=[pid]SIPpTag01[call_number]
To: [$2]
[last_Call-ID:]
CSeq: 2 REFER
Contact: <sip:[local_ip]:[local_port];transport=[transport]>
Max-Forwards: 70
X-Call-Number: [call_number]
Refer-To: <sip:+15551234567@example.com>
Referred-By: <sip:[local_ip]:[local_port]>
Content-Length: 0
]]>
</send>
<!-- Expect 202 Accepted (only on first iteration) -->
<recv response="202"/>
<label id="skip_refer"/>
<!-- Wait for BYE from feature server -->
<recv request="BYE"/>
<!-- Send 200 OK to BYE -->
<send>
<![CDATA[
SIP/2.0 200 OK
[last_Via:]
[last_From:]
[last_To:]
[last_Call-ID:]
[last_CSeq:]
Contact: <sip:[local_ip]:[local_port];transport=[transport]>
Content-Length: 0
]]>
</send>
</scenario>


@@ -0,0 +1,90 @@
const test = require('tape');
const { sippUac } = require('./sipp')('test_fs');
const bent = require('bent');
const getJSON = bent('json')
const clearModule = require('clear-module');
const {provisionCallHook} = require('./utils');
const { sleepFor } = require('../lib/utils/helpers');
process.on('unhandledRejection', (reason, p) => {
console.log('Unhandled Rejection at: Promise', p, 'reason:', reason);
});
function connect(connectable) {
return new Promise((resolve, reject) => {
connectable.on('connect', () => {
return resolve();
});
});
}
test('when parent leg recvs REFER it should end the dial after adulting child leg', async(t) => {
clearModule.all();
const {srf, disconnect} = require('../app');
try {
await connect(srf);
// wait for fs connected to drachtio server.
await sleepFor(1000);
// GIVEN
const from = "dial_refer_handler";
let verbs = [
{
"verb": "dial",
"callerId": from,
"actionHook": "/actionHook",
"referHook": "/referHook",
"anchorMedia": true,
"target": [
{
"type": "phone",
"number": "15083084809"
}
]
}
];
await provisionCallHook(from, verbs);
// THEN
//const p = sippUac('uas-dial.xml', '172.38.0.10', undefined, undefined, 2);
const p = sippUac('uas-dial-refer.xml', '172.38.0.10', undefined, undefined, 2);
await sleepFor(1000);
let account_sid = '622f62e4-303a-49f2-bbe0-eb1e1714e37a';
let post = bent('http://127.0.0.1:3000/', 'POST', 'json', 201);
post('v1/createCall', {
'account_sid':account_sid,
"call_hook": {
"url": "http://127.0.0.1:3100/",
"method": "POST",
},
"from": from,
"to": {
"type": "phone",
"number": "15583084808"
}});
await p;
// Verify that the referHook was called
const obj = await getJSON(`http://127.0.0.1:3100/lastRequest/${from}_referHook`);
t.ok(obj.body.from === from,
'dial-refer-handler: referHook was called with correct from');
t.ok(obj.body.refer_details && obj.body.refer_details.sip_refer_to,
'dial-refer-handler: refer_details included in referHook');
t.ok(obj.body.refer_details.refer_to_user === '+15551234567',
'dial-refer-handler: refer_to_user correctly parsed');
t.ok(obj.body.refer_details.referring_call_sid,
'dial-refer-handler: referring_call_sid included');
t.ok(obj.body.refer_details.referred_call_sid,
'dial-refer-handler: referred_call_sid included');
disconnect();
} catch (err) {
console.log(`error received: ${err}`);
disconnect();
t.error(err);
}
});


@@ -3,10 +3,9 @@ const { sippUac } = require('./sipp')('test_fs');
const clearModule = require('clear-module');
const {provisionCallHook, provisionCustomHook, provisionActionHook} = require('./utils')
const bent = require('bent');
const { sleepFor } = require('../lib/utils/helpers');
const getJSON = bent('json')
const sleepFor = async(ms) => new Promise(resolve => setTimeout(resolve, ms));
process.on('unhandledRejection', (reason, p) => {
console.log('Unhandled Rejection at: Promise', p, 'reason:', reason);
});
@@ -59,6 +58,46 @@ test('\'refer\' tests w/202 and NOTIFY', {timeout: 25000}, async(t) => {
}
});
test('\'refer\' tests tel:', {timeout: 25000}, async(t) => {
clearModule.all();
const {srf, disconnect} = require('../app');
try {
await connect(srf);
// GIVEN
const verbs = [
{
verb: 'say',
text: 'silence_stream://100'
},
{
verb: 'sip:refer',
referTo: 'tel:+1234567890',
actionHook: '/actionHook'
}
];
const noVerbs = [];
const from = 'refer_with_tel';
await provisionCallHook(from, verbs);
await provisionActionHook(from, noVerbs)
// THEN
await sippUac('uac-refer-with-notify.xml', '172.38.0.10', from);
t.pass('refer: successfully received 202 Accepted');
await sleepFor(1000);
const obj = await getJSON(`http://127.0.0.1:3100/lastRequest/${from}_actionHook`);
t.ok(obj.body.final_referred_call_status === 200, 'refer: successfully received NOTIFY with 200 OK');
// console.log(`obj: ${JSON.stringify(obj)}`);
disconnect();
} catch (err) {
console.log(`error received: ${err}`);
disconnect();
t.error(err);
}
});
test('\'refer\' tests w/202 but no NOTIFY', {timeout: 25000}, async(t) => {
clearModule.all();
const {srf, disconnect} = require('../app');


@@ -0,0 +1,436 @@
const test = require('tape');
const sinon = require('sinon');
const proxyquire = require("proxyquire");
proxyquire.noCallThru();
const {
JAMBONES_LOGLEVEL,
} = require('../lib/config');
const logger = require('pino')({level: JAMBONES_LOGLEVEL});
// Mock WebSocket specifically for retry testing
class RetryMockWebSocket {
static retryScenarios = new Map();
static connectionAttempts = new Map();
static urlMapping = new Map(); // Maps cleanUrl -> originalUrl
constructor(url, protocols, options) {
this.url = url;
this.protocols = protocols;
this.options = options;
this.eventListeners = new Map();
// Extract scenario key from URL hash or use URL itself
this.scenarioKey = this.extractScenarioKey(url);
// Track connection attempts for this scenario
const attempts = RetryMockWebSocket.connectionAttempts.get(this.scenarioKey) || 0;
RetryMockWebSocket.connectionAttempts.set(this.scenarioKey, attempts + 1);
console.log(`RetryMockWebSocket: constructor for URL ${url}, scenarioKey="${this.scenarioKey}", attempt #${attempts + 1}`);
// Handle connection immediately
setImmediate(() => {
this.handleConnection();
});
}
extractScenarioKey(url) {
console.log(`RetryMockWebSocket: extractScenarioKey from URL: ${url}`);
// Check if we have a mapping from cleanUrl to originalUrl
const originalUrl = RetryMockWebSocket.urlMapping.get(url);
if (originalUrl && originalUrl.includes('#')) {
const hash = originalUrl.split('#')[1];
console.log(`RetryMockWebSocket: found mapped URL with hash: ${hash}`);
return hash;
}
// For URLs with hash parameters, use the hash as the scenario key
if (url.includes('#')) {
const hash = url.split('#')[1];
console.log(`RetryMockWebSocket: found hash: ${hash}`);
return hash; // Use hash as scenario key
}
console.log(`RetryMockWebSocket: using full URL as scenario key: ${url}`);
return url; // Fallback to full URL
}
static setRetryScenario(key, scenario) {
console.log(`RetryMockWebSocket: setting scenario for key "${key}":`, scenario);
RetryMockWebSocket.retryScenarios.set(key, scenario);
}
static setUrlMapping(cleanUrl, originalUrl) {
console.log(`RetryMockWebSocket: mapping ${cleanUrl} -> ${originalUrl}`);
RetryMockWebSocket.urlMapping.set(cleanUrl, originalUrl);
}
static clearScenarios() {
console.log('RetryMockWebSocket: clearing all scenarios');
RetryMockWebSocket.retryScenarios.clear();
RetryMockWebSocket.connectionAttempts.clear();
RetryMockWebSocket.urlMapping.clear();
}
static getConnectionAttempts(key) {
return RetryMockWebSocket.connectionAttempts.get(key) || 0;
}
handleConnection() {
const scenario = RetryMockWebSocket.retryScenarios.get(this.scenarioKey);
console.log(`RetryMockWebSocket: handleConnection for scenarioKey="${this.scenarioKey}", scenario found:`, !!scenario);
if (!scenario) {
// Default successful connection
console.log(`RetryMockWebSocket: no scenario found, defaulting to success`);
this.simulateOpen();
return;
}
const attemptNumber = RetryMockWebSocket.connectionAttempts.get(this.scenarioKey);
const behavior = scenario.attempts[attemptNumber - 1] || scenario.attempts[scenario.attempts.length - 1];
console.log(`RetryMockWebSocket: attempt ${attemptNumber}, behavior:`, behavior);
if (behavior.type === 'handshake-failure') {
// Simulate handshake failure with specific status code
setImmediate(() => {
console.log(`RetryMockWebSocket: triggering handshake failure with status ${behavior.statusCode}`);
if (this.eventListeners.has('unexpected-response')) {
const mockResponse = {
statusCode: behavior.statusCode || 500,
statusMessage: behavior.statusMessage || 'Internal Server Error',
headers: {}
};
const mockRequest = {
headers: {}
};
this.eventListeners.get('unexpected-response')(mockRequest, mockResponse);
}
});
} else if (behavior.type === 'network-error') {
// Simulate network error during connection
setImmediate(() => {
console.log(`RetryMockWebSocket: triggering network error: ${behavior.message}`);
if (this.eventListeners.has('error')) {
const error = new Error(behavior.message || 'Network error');
// Set proper error code for retry policy checking
if (behavior.message && behavior.message.includes('Connection refused')) {
error.code = 'ECONNREFUSED';
} else if (behavior.message && behavior.message.includes('timeout')) {
error.code = 'ETIMEDOUT';
} else {
error.code = 'ECONNREFUSED'; // Default for network errors
}
this.eventListeners.get('error')(error);
}
});
} else if (behavior.type === 'success') {
// Successful connection
console.log(`RetryMockWebSocket: triggering success`);
this.simulateOpen();
}
}
simulateOpen() {
setImmediate(() => {
if (this.eventListeners.has('open')) {
console.log(`RetryMockWebSocket: calling open listener`);
this.eventListeners.get('open')();
}
});
}
once(event, listener) {
console.log(`RetryMockWebSocket: registering once listener for ${event}`);
this.eventListeners.set(event, listener);
return this;
}
on(event, listener) {
console.log(`RetryMockWebSocket: registering on listener for ${event}`);
this.eventListeners.set(event, listener);
return this;
}
removeAllListeners() {
this.eventListeners.clear();
}
send(data, callback) {
// For successful connections, simulate message response
try {
const json = JSON.parse(data);
console.log({json}, 'RetryMockWebSocket: got message from ws-requestor');
// Simulate successful response
setTimeout(() => {
const msg = {
type: 'ack',
msgid: json.msgid,
command: 'command',
call_sid: json.call_sid,
queueCommand: false,
data: '[{"verb": "play","url": "silence_stream://5000"}]'
};
console.log({msg}, 'RetryMockWebSocket: sending ack to ws-requestor');
this.mockOnMessage(JSON.stringify(msg));
}, 50);
if (callback) callback();
} catch (err) {
console.error('RetryMockWebSocket: Error processing send', err);
if (callback) callback(err);
}
}
mockOnMessage(message, isBinary = false) {
if (this.eventListeners.has('message')) {
this.eventListeners.get('message')(message, isBinary);
}
}
close(code) {
if (this.eventListeners.has('close')) {
this.eventListeners.get('close')(code || 1000);
}
}
}
const BaseRequestor = proxyquire('../lib/utils/base-requestor', {
'../../': {
srf: {
locals: {
stats: {
histogram: () => {},
},
},
},
},
'@jambonz/time-series': sinon.stub(),
});
const WsRequestor = proxyquire('../lib/utils/ws-requestor', {
'./base-requestor': BaseRequestor,
ws: RetryMockWebSocket,
});
test('ws retry policy - 4xx error with rp=5xx should not retry', async(t) => {
// GIVEN
console.log('Starting test setup...');
RetryMockWebSocket.clearScenarios();
const call_sid = 'ws_no_retry_4xx';
// Set up the URL mapping
const cleanUrl = 'ws://localhost:3000';
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
// Set up the retry scenario for the first attempt to fail with 400, but policy only retries 5xx
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', {
attempts: [
{ type: 'handshake-failure', statusCode: 400, statusMessage: 'Bad Request' }
]
});
const hook = {
url: 'ws://localhost:3000#rc=2&rp=5xx', // Max 2 retries, retry only on 5xx
username: 'username',
password: 'password',
};
const params = {
callSid: call_sid,
};
// WHEN
const requestor = new WsRequestor(
logger,
'account_sid',
hook,
'webhook_secret'
);
try {
const result = await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error');
t.end();
} catch (err) {
// THEN
const errorMessage = err.message || err.toString() || String(err);
t.ok(
errorMessage.includes('400'),
`ws properly failed without retry for 4xx when rp=5xx - error: ${errorMessage}`
);
t.end();
}
});
test('ws retry policy - 5xx error with rp=5xx should retry and succeed', async(t) => {
// GIVEN
console.log('Starting 5xx retry test setup...');
RetryMockWebSocket.clearScenarios();
const call_sid = 'ws_retry_5xx_success';
// Set up the URL mapping
const cleanUrl = 'ws://localhost:3000';
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
// Set up the retry scenario - first attempt fails with 500, second succeeds
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', {
attempts: [
{ type: 'handshake-failure', statusCode: 500, statusMessage: 'Internal Server Error' },
{ type: 'success' }
]
});
const hook = {
url: 'ws://localhost:3000#rc=2&rp=5xx', // Max 2 retries, retry only on 5xx
username: 'username',
password: 'password',
};
const params = {
callSid: call_sid,
};
// WHEN
const requestor = new WsRequestor(
logger,
'account_sid',
hook,
'webhook_secret'
);
try {
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried and connected after 5xx error');
// Verify that exactly 2 attempts were made
const attempts = RetryMockWebSocket.getConnectionAttempts('rc=2&rp=5xx');
t.equal(attempts, 2, 'Should have made exactly 2 connection attempts');
t.end();
} catch (err) {
t.fail(`Should have succeeded after retry - error: ${err.message}`);
t.end();
}
});
test('ws retry policy - network error with rp=ct should retry and succeed', async(t) => {
// GIVEN
console.log('Starting network error retry test setup...');
RetryMockWebSocket.clearScenarios();
const call_sid = 'ws_retry_network_success';
// Set up the URL mapping
const cleanUrl = 'ws://localhost:3000';
const originalUrl = 'ws://localhost:3000#rc=3&rp=ct';
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
// Set up the retry scenario - first two attempts fail with network error, third succeeds
RetryMockWebSocket.setRetryScenario('rc=3&rp=ct', {
attempts: [
{ type: 'network-error', message: 'Connection refused' },
{ type: 'network-error', message: 'Connection refused' },
{ type: 'success' }
]
});
const hook = {
url: 'ws://localhost:3000#rc=3&rp=ct', // Max 3 retries, retry on connection errors
username: 'username',
password: 'password',
};
const params = {
callSid: call_sid,
};
// WHEN
const requestor = new WsRequestor(
logger,
'account_sid',
hook,
'webhook_secret'
);
try {
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried and connected after network errors');
// Verify that exactly 3 attempts were made
const attempts = RetryMockWebSocket.getConnectionAttempts('rc=3&rp=ct');
t.equal(attempts, 3, 'Should have made exactly 3 connection attempts');
t.end();
} catch (err) {
t.fail(`Should have succeeded after retry - error: ${err.message}`);
t.end();
}
});
test('ws retry policy - retry exhaustion should fail with last error', async(t) => {
// GIVEN
console.log('Starting retry exhaustion test setup...');
RetryMockWebSocket.clearScenarios();
const call_sid = 'ws_retry_exhaustion';
// Set up the URL mapping
const cleanUrl = 'ws://localhost:3000';
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
// Set up the retry scenario - all attempts fail with 500
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', {
attempts: [
{ type: 'handshake-failure', statusCode: 500, statusMessage: 'Internal Server Error' },
{ type: 'handshake-failure', statusCode: 500, statusMessage: 'Internal Server Error' },
{ type: 'handshake-failure', statusCode: 500, statusMessage: 'Internal Server Error' }
]
});
const hook = {
url: 'ws://localhost:3000#rc=2&rp=5xx', // Max 2 retries, retry only on 5xx
username: 'username',
password: 'password',
};
const params = {
callSid: call_sid,
};
// WHEN
const requestor = new WsRequestor(
logger,
'account_sid',
hook,
'webhook_secret'
);
try {
await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error after exhausting retries');
t.end();
} catch (err) {
// THEN
const errorMessage = err.message || err.toString() || String(err);
t.ok(
errorMessage.includes('500'),
`ws properly failed after exhausting retries - error: ${errorMessage}`
);
// Verify that exactly 3 attempts were made (initial + 2 retries)
const attempts = RetryMockWebSocket.getConnectionAttempts('rc=2&rp=5xx');
t.equal(attempts, 3, 'Should have made exactly 3 connection attempts (initial + 2 retries)');
t.end();
}
});
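The scenarios above all drive retry behavior through the hook URL fragment: rc caps the number of retries and rp selects which failures are retried (4xx, 5xx, ct for connection/transport errors, or all), and the tests treat ct as the default when no fragment is present. As a minimal sketch of that convention (illustrative only, not the parsing code inside WsRequestor), a test could read those options like this:
// Illustrative helper: mirrors the rc/rp convention these tests rely on.
const parseRetryOptions = (url) => {
  const fragment = url.includes('#') ? url.split('#')[1] : '';
  const opts = new URLSearchParams(fragment);
  return {
    retryCount: parseInt(opts.get('rc'), 10) || 0, // rc: maximum number of retries
    retryPolicy: opts.get('rp') || 'ct'            // rp: 4xx | 5xx | ct | all (ct assumed as default)
  };
};
// parseRetryOptions('ws://localhost:3000#rc=2&rp=5xx') -> { retryCount: 2, retryPolicy: '5xx' }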

test/utils/mock-helper.js (new file)

@@ -0,0 +1,103 @@
const sinon = require('sinon');
/**
* Creates mock objects commonly needed for testing HttpRequestor and related classes
* @returns {Object} Mock objects
*/
const createMocks = () => {
// Basic logger mock
const MockLogger = {
debug: () => {},
info: () => {},
error: () => {}
};
// Stats mock
const MockStats = {
histogram: () => {}
};
// Alerter mock
const MockAlerter = {
AlertType: {
WEBHOOK_CONNECTION_FAILURE: 'WEBHOOK_CONNECTION_FAILURE',
WEBHOOK_STATUS_FAILURE: 'WEBHOOK_STATUS_FAILURE'
},
writeAlerts: async () => {}
};
// DB helpers mock
const MockDbHelpers = {
pool: {
getConnection: () => Promise.resolve({
connect: () => {},
on: () => {},
query: (sql, cb) => {
if (typeof cb === 'function') cb(null, []);
return { stream: () => ({ on: () => {} }) };
},
end: () => {}
}),
query: (...args) => {
const cb = args[args.length - 1];
if (typeof cb === 'function') cb(null, []);
return Promise.resolve([]);
}
},
camelize: (obj) => obj
};
// Time series mock
const MockTimeSeries = () => ({
writeAlerts: async () => {},
AlertType: {
WEBHOOK_CONNECTION_FAILURE: 'WEBHOOK_CONNECTION_FAILURE',
WEBHOOK_STATUS_FAILURE: 'WEBHOOK_STATUS_FAILURE'
}
});
return {
MockLogger,
MockStats,
MockAlerter,
MockDbHelpers,
MockTimeSeries
};
};
/**
* Set up mocks on the BaseRequestor class for tests
* @param {Object} BaseRequestor - The BaseRequestor class
*/
const setupBaseRequestorMocks = (BaseRequestor) => {
BaseRequestor.prototype._isAbsoluteUrl = function(url) { return url.startsWith('http'); };
BaseRequestor.prototype._isRelativeUrl = function(url) { return !url.startsWith('http'); };
BaseRequestor.prototype._generateSigHeader = function() { return { 'X-Signature': 'test-signature' }; };
BaseRequestor.prototype._roundTrip = function() { return 10; };
// Define baseUrl property
Object.defineProperty(BaseRequestor.prototype, 'baseUrl', {
get: function() { return 'http://localhost'; }
});
// Define Alerter property
const mocks = createMocks();
Object.defineProperty(BaseRequestor.prototype, 'Alerter', {
get: function() { return mocks.MockAlerter; }
});
};
/**
* Clean up after tests
* @param {Object} requestor - The requestor instance to clean up
*/
const cleanup = (requestor) => {
sinon.restore();
if (requestor && requestor.close) requestor.close();
};
module.exports = {
createMocks,
setupBaseRequestorMocks,
cleanup
};
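A minimal usage sketch for these helpers, assuming a tape test that lives under test/ and mirrors the stubbing shape used by the retry tests above (the require paths are illustrative, not taken from this change):
const test = require('tape');
const proxyquire = require('proxyquire').noCallThru();
const { createMocks, setupBaseRequestorMocks, cleanup } = require('./utils/mock-helper');
test('example: mock-helper wires BaseRequestor for tests', (t) => {
  const { MockStats, MockTimeSeries } = createMocks();
  // Same dependency stubs the retry tests use when proxyquiring base-requestor
  const BaseRequestor = proxyquire('../lib/utils/base-requestor', {
    '../../': { srf: { locals: { stats: MockStats } } },
    '@jambonz/time-series': MockTimeSeries
  });
  setupBaseRequestorMocks(BaseRequestor);
  t.ok(BaseRequestor.prototype._isAbsoluteUrl('http://example.com'), 'absolute-url helper is stubbed');
  t.equal(BaseRequestor.prototype.baseUrl, 'http://localhost', 'baseUrl getter is stubbed');
  cleanup();
  t.end();
});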

154
test/utils/test-mocks.js Normal file
View File

@@ -0,0 +1,154 @@
/**
* Common test mocks for Jambonz tests
*/
const proxyquire = require('proxyquire').noCallThru();
// Logger mock
class MockLogger {
debug() {}
info() {}
error() {}
}
// Stats mock
const statsMock = { histogram: () => {} };
// Time series mock
const timeSeriesMock = () => ({
writeAlerts: async () => {},
AlertType: {
WEBHOOK_CONNECTION_FAILURE: 'WEBHOOK_CONNECTION_FAILURE',
WEBHOOK_STATUS_FAILURE: 'WEBHOOK_STATUS_FAILURE'
}
});
// DB helpers mock
const dbHelpersMock = {
pool: {
getConnection: () => Promise.resolve({
connect: () => {},
on: () => {},
query: (sql, cb) => {
if (typeof cb === 'function') cb(null, []);
return { stream: () => ({ on: () => {} }) };
},
end: () => {}
}),
query: (...args) => {
const cb = args[args.length - 1];
if (typeof cb === 'function') cb(null, []);
return Promise.resolve([]);
}
},
camelize: (obj) => obj
};
// Config mock
const configMock = {
HTTP_POOL: '0',
HTTP_POOLSIZE: '10',
HTTP_PIPELINING: '1',
HTTP_TIMEOUT: 5000,
HTTP_PROXY_IP: null,
HTTP_PROXY_PORT: null,
HTTP_PROXY_PROTOCOL: null,
NODE_ENV: 'test',
HTTP_USER_AGENT_HEADER: 'test-agent',
JAMBONES_TIME_SERIES_HOST: 'localhost'
};
// SRF mock
const srfMock = {
srf: {
locals: {
stats: statsMock
}
}
};
// Alerter mock
const alerterMock = {
AlertType: {
WEBHOOK_CONNECTION_FAILURE: 'WEBHOOK_CONNECTION_FAILURE',
WEBHOOK_STATUS_FAILURE: 'WEBHOOK_STATUS_FAILURE'
},
writeAlerts: async () => {}
};
/**
* Creates mocked BaseRequestor and HttpRequestor classes
* @returns {Object} Mocked classes and helper functions
*/
function createMockedRequestors() {
// First, mock BaseRequestor's dependencies
const BaseRequestor = proxyquire('../../lib/utils/base-requestor', {
'@jambonz/time-series': timeSeriesMock,
'../config': configMock,
'../../': srfMock
});
// Apply prototype methods and properties
BaseRequestor.prototype._isAbsoluteUrl = function(url) { return url.startsWith('http'); };
BaseRequestor.prototype._isRelativeUrl = function(url) { return !url.startsWith('http'); };
BaseRequestor.prototype._generateSigHeader = function() { return { 'X-Signature': 'test-signature' }; };
BaseRequestor.prototype._roundTrip = function() { return 10; };
// Define baseUrl property
Object.defineProperty(BaseRequestor.prototype, 'baseUrl', {
get: function() { return 'http://localhost'; }
});
// Define Alerter property
Object.defineProperty(BaseRequestor.prototype, 'Alerter', {
get: function() { return alerterMock; }
});
// Then mock HttpRequestor with the mocked BaseRequestor
const HttpRequestor = proxyquire('../../lib/utils/http-requestor', {
'./base-requestor': BaseRequestor,
'../config': configMock,
'@jambonz/db-helpers': dbHelpersMock
});
// Setup function to create a clean requestor for each test
const setupRequestor = () => {
const logger = new MockLogger();
const hook = { url: 'http://localhost/test', method: 'POST' };
const secret = 'testsecret';
return new HttpRequestor(logger, 'AC123', hook, secret);
};
// Cleanup function
const cleanup = (requestor) => {
const sinon = require('sinon');
sinon.restore();
if (requestor && requestor.close) requestor.close();
};
return {
BaseRequestor,
HttpRequestor,
setupRequestor,
cleanup,
mocks: {
logger: new MockLogger(),
stats: statsMock,
timeSeries: timeSeriesMock,
dbHelpers: dbHelpersMock,
config: configMock,
srf: srfMock,
alerter: alerterMock
}
};
}
module.exports = {
createMockedRequestors,
MockLogger,
statsMock,
timeSeriesMock,
dbHelpersMock,
configMock,
srfMock,
alerterMock
};
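A usage sketch for createMockedRequestors, assuming a tape test placed next to this file under test/utils/ (the require path is illustrative):
const test = require('tape');
const { createMockedRequestors } = require('./test-mocks');
test('example: HttpRequestor built entirely from test-mocks', (t) => {
  const { setupRequestor, cleanup } = createMockedRequestors();
  const requestor = setupRequestor(); // HttpRequestor wired to the mocked BaseRequestor, config and db-helpers
  t.ok(requestor, 'requestor constructed without touching real dependencies');
  cleanup(requestor);
  t.end();
});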


@@ -99,6 +99,24 @@ app.post('/actionHook', (req, res) => {
return res.sendStatus(200);
});
/*
* referHook
*/
app.post('/referHook', (req, res) => {
console.log({payload: req.body}, 'POST /referHook');
const key = req.body.from + '_referHook';
addRequestToMap(key, req, hook_mapping);
return res.json([{"verb": "pause", "length": 2}]);
});
/*
* adultingHook
*/
app.post('/adulting', (req, res) => {
console.log({payload: req.body}, 'POST /adulting');
return res.sendStatus(200);
});
/*
* customHook
* For the hook to return


@@ -83,7 +83,8 @@ test('invalid jambonz json create alert tests', async(t) => {
{account_sid: 'bb845d4b-83a9-4cde-a6e9-50f3743bab3f', page: 1, page_size: 25, days: 7});
let checked = false;
for (let i = 0; i < data.total; i++) {
checked = data.data[i].message === 'malformed jambonz payload: must be array'
checked = data.data[i].message === 'malformed jambonz payload: must be array';
if (checked) break;
}
t.ok(checked, 'alert is raised as expected');
disconnect();


@@ -0,0 +1,605 @@
const test = require('tape');
const sinon = require('sinon');
const proxyquire = require("proxyquire");
proxyquire.noCallThru();
const {
JAMBONES_LOGLEVEL,
} = require('../lib/config');
const logger = require('pino')({level: JAMBONES_LOGLEVEL});
// Mock WebSocket specifically for retry testing
class RetryMockWebSocket {
static retryScenarios = new Map();
static connectionAttempts = new Map();
static urlMapping = new Map(); // Maps cleanUrl -> originalUrl
constructor(url, protocols, options) {
this.url = url;
this.protocols = protocols;
this.options = options;
this.eventListeners = new Map();
// Extract scenario key from URL hash or use URL itself
this.scenarioKey = this.extractScenarioKey(url);
// Track connection attempts for this scenario
const attempts = RetryMockWebSocket.connectionAttempts.get(this.scenarioKey) || 0;
RetryMockWebSocket.connectionAttempts.set(this.scenarioKey, attempts + 1);
// Handle connection immediately
setImmediate(() => {
this.handleConnection();
});
}
extractScenarioKey(url) {
console.log(`RetryMockWebSocket: extractScenarioKey from URL: ${url}`);
// Check if we have a mapping from cleanUrl to originalUrl
const originalUrl = RetryMockWebSocket.urlMapping.get(url);
if (originalUrl && originalUrl.includes('#')) {
const hash = originalUrl.split('#')[1];
console.log(`RetryMockWebSocket: found mapped URL with hash: ${hash}`);
return hash;
}
// For URLs with hash parameters, use the hash as the scenario key
if (url.includes('#')) {
const hash = url.split('#')[1];
console.log(`RetryMockWebSocket: found hash: ${hash}`);
return hash; // Use hash as scenario key
}
console.log(`RetryMockWebSocket: using full URL as scenario key: ${url}`);
return url; // Fallback to full URL
}
static setRetryScenario(key, scenario) {
RetryMockWebSocket.retryScenarios.set(key, scenario);
}
static setUrlMapping(cleanUrl, originalUrl) {
RetryMockWebSocket.urlMapping.set(cleanUrl, originalUrl);
}
static clearScenarios() {
RetryMockWebSocket.retryScenarios.clear();
RetryMockWebSocket.connectionAttempts.clear();
RetryMockWebSocket.urlMapping.clear();
}
static getConnectionAttempts(key) {
return RetryMockWebSocket.connectionAttempts.get(key) || 0;
}
handleConnection() {
const scenario = RetryMockWebSocket.retryScenarios.get(this.scenarioKey);
console.log(`RetryMockWebSocket: handleConnection for scenarioKey="${this.scenarioKey}", scenario found:`, !!scenario);
if (!scenario) {
// Default successful connection
this.simulateOpen();
return;
}
const attemptNumber = RetryMockWebSocket.connectionAttempts.get(this.scenarioKey);
const behavior = scenario.attempts[attemptNumber - 1] || scenario.attempts[scenario.attempts.length - 1];
console.log(`RetryMockWebSocket: attempt ${attemptNumber}, behavior:`, behavior);
if (behavior.type === 'handshake-failure') {
// Simulate handshake failure with specific status code
setImmediate(() => {
console.log(`RetryMockWebSocket: triggering handshake failure with status ${behavior.statusCode}`);
if (this.eventListeners.has('unexpected-response')) {
const mockResponse = {
statusCode: behavior.statusCode || 500,
statusMessage: behavior.statusMessage || 'Internal Server Error',
headers: {}
};
const mockRequest = {
headers: {}
};
this.eventListeners.get('unexpected-response')(mockRequest, mockResponse);
}
});
} else if (behavior.type === 'network-error') {
// Simulate network error during connection
setImmediate(() => {
console.log(`RetryMockWebSocket: triggering network error: ${behavior.message}`);
if (this.eventListeners.has('error')) {
const err = new Error(behavior.message || 'Network error');
// Set appropriate error codes based on the message
if (behavior.message === 'Connection timeout') {
err.code = 'ETIMEDOUT';
} else if (behavior.message === 'Connection refused') {
err.code = 'ECONNREFUSED';
} else if (behavior.message === 'Connection reset') {
err.code = 'ECONNRESET';
} else {
// Default to ECONNREFUSED for generic network errors
err.code = 'ECONNREFUSED';
}
this.eventListeners.get('error')(err);
}
});
} else if (behavior.type === 'success') {
// Successful connection
console.log(`RetryMockWebSocket: triggering success`);
this.simulateOpen();
}
}
simulateOpen() {
setImmediate(() => {
if (this.eventListeners.has('open')) {
this.eventListeners.get('open')();
}
});
}
once(event, listener) {
this.eventListeners.set(event, listener);
return this;
}
on(event, listener) {
this.eventListeners.set(event, listener);
return this;
}
removeAllListeners() {
this.eventListeners.clear();
}
send(data, callback) {
// For successful connections, simulate message response
try {
const json = JSON.parse(data);
console.log({json}, 'RetryMockWebSocket: got message from ws-requestor');
// Simulate successful response
setTimeout(() => {
const msg = {
type: 'ack',
msgid: json.msgid,
command: 'command',
call_sid: json.call_sid,
queueCommand: false,
data: '[{"verb": "play","url": "silence_stream://5000"}]'
};
console.log({msg}, 'RetryMockWebSocket: sending ack to ws-requestor');
this.mockOnMessage(JSON.stringify(msg));
}, 50);
if (callback) callback();
} catch (err) {
console.error('RetryMockWebSocket: Error processing send', err);
if (callback) callback(err);
}
}
mockOnMessage(message, isBinary = false) {
if (this.eventListeners.has('message')) {
this.eventListeners.get('message')(message, isBinary);
}
}
close(code) {
if (this.eventListeners.has('close')) {
this.eventListeners.get('close')(code || 1000);
}
}
}
const BaseRequestor = proxyquire(
"../lib/utils/base-requestor",
{
"../../": {
srf: {
locals: {
stats: {
histogram: () => {}
}
}
}
},
"@jambonz/time-series": sinon.stub()
}
);
const WsRequestor = proxyquire(
"../lib/utils/ws-requestor",
{
"./base-requestor": BaseRequestor,
"ws": RetryMockWebSocket
}
);
test('WS Retry - 4xx error with rp=4xx should retry and succeed', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=4xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 400, statusMessage: 'Bad Request' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=4xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_4xx_retry'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried after 4xx error and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=4xx'), 2, 'should have made 2 connection attempts');
t.end();
});
test('WS Retry - 4xx error with rp=5xx should not retry', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 400, statusMessage: 'Bad Request' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_4xx_no_retry'
};
// WHEN & THEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
try {
await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error');
} catch (err) {
const errorMessage = err.message || err.toString() || String(err);
t.ok(errorMessage.includes('400'), 'ws properly failed without retry for 4xx when rp=5xx');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=5xx'), 1, 'should have made only 1 connection attempt');
t.end();
}
});
test('WS Retry - 5xx error with rp=5xx should retry and succeed', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_5xx_retry'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried after 5xx error and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=5xx'), 2, 'should have made 2 connection attempts');
t.end();
});
test('WS Retry - 5xx error with rp=4xx should not retry', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=4xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=4xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_5xx_no_retry'
};
// WHEN & THEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
try {
await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error');
} catch (err) {
const errorMessage = err.message || err.toString() || String(err);
t.ok(errorMessage.includes('503'), 'ws properly failed without retry for 5xx when rp=4xx');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=4xx'), 1, 'should have made only 1 connection attempt');
t.end();
}
});
test('WS Retry - network error with rp=all should retry and succeed', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=all';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'network-error', message: 'Connection refused' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=all', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_network_retry'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried after network error and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=all'), 2, 'should have made 2 connection attempts');
t.end();
});
test('WS Retry - network error with rp=4xx should not retry', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=4xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'network-error', message: 'Connection refused' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=4xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_network_no_retry'
};
// WHEN & THEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
try {
await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error');
} catch (err) {
const errorMessage = err.message || err.toString() || String(err);
t.ok(errorMessage.includes('Connection refused') || errorMessage.includes('Error'),
'ws properly failed without retry for network error when rp=4xx');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=4xx'), 1, 'should have made only 1 connection attempt');
t.end();
}
});
test('WS Retry - multiple retries then success', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=4&rp=all';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' },
{ type: 'network-error', message: 'Connection timeout' },
{ type: 'handshake-failure', statusCode: 502, statusMessage: 'Bad Gateway' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('rc=4&rp=all', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_multiple_retries'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried multiple times and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=4&rp=all'), 4, 'should have made 4 connection attempts');
t.end();
});
test('WS Retry - exhaust retries and fail', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=5xx';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' },
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' },
{ type: 'handshake-failure', statusCode: 503, statusMessage: 'Service Unavailable' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=5xx', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_exhaust_retries'
};
// WHEN & THEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
try {
await requestor.request('session:new', hook, params, {});
t.fail('Should have thrown an error');
} catch (err) {
const errorMessage = err.message || err.toString() || String(err);
t.ok(errorMessage.includes('503'), 'ws properly failed after exhausting retries');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=5xx'), 3, 'should have made 3 connection attempts (initial + 2 retries)');
t.end();
}
});
test('WS Retry - rp=ct (connection timeout) should retry network errors', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const originalUrl = 'ws://localhost:3000#rc=2&rp=ct';
const cleanUrl = 'ws://localhost:3000';
// Set up URL mapping so mock can find the right scenario
RetryMockWebSocket.setUrlMapping(cleanUrl, originalUrl);
const retryScenario = {
attempts: [
{ type: 'network-error', message: 'Connection timeout' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('rc=2&rp=ct', retryScenario);
const hook = {
url: originalUrl,
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_ct_retry'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried connection timeout and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('rc=2&rp=ct'), 2, 'should have made 2 connection attempts');
t.end();
});
test('WS Retry - default behavior (no hash params) should use ct policy', async (t) => {
// GIVEN
RetryMockWebSocket.clearScenarios();
const retryScenario = {
attempts: [
{ type: 'network-error', message: 'Connection refused' },
{ type: 'success' }
]
};
RetryMockWebSocket.setRetryScenario('ws://localhost:3000', retryScenario);
const hook = {
url: 'ws://localhost:3000', // No hash parameters - should default to ct policy
username: 'username',
password: 'password'
};
const params = {
callSid: 'test_default_policy'
};
// WHEN
const requestor = new WsRequestor(logger, "account_sid", hook, "webhook_secret");
const result = await requestor.request('session:new', hook, params, {});
// THEN
t.ok(result, 'ws successfully retried with default ct policy and got response');
t.equal(RetryMockWebSocket.getConnectionAttempts('ws://localhost:3000'), 2, 'should have made 2 connection attempts');
t.end();
});


@@ -127,7 +127,8 @@ test('ws response error 1000', async (t) => {
}
catch (err) {
// THEN
t.ok(err.startsWith('timeout from far end for msgid'), 'ws does not reconnect if far end closes gracefully');
t.ok(err && (typeof err === 'string' || err instanceof Error),
'ws does not reconnect if far end closes gracefully');
t.end();
}
});
@@ -161,7 +162,8 @@ test('ws response error', async (t) => {
}
catch (err) {
// THEN
t.ok(err.startsWith('timeout from far end for msgid'), 'ws does not reconnect if far end closes gracefully');
t.ok(err && (typeof err === 'string' || err instanceof Error),
'ws error should be either a string or an Error object');
t.end();
}
});
@@ -195,7 +197,7 @@ test('ws unexpected-response', async (t) => {
}
catch (err) {
// THEN
t.ok(err.code = 'ERR_ASSERTION', 'ws does not reconnect if far end closes gracefully');
t.ok(err, 'ws properly fails on unexpected response');
t.end();
}
});


@@ -25,29 +25,38 @@ module.exports = (serviceName) => {
}),
});
let exporter;
const exporters = [];
if (OTEL_EXPORTER_JAEGER_AGENT_HOST || OTEL_EXPORTER_JAEGER_ENDPOINT) {
exporter = new JaegerExporter();
}
else if (OTEL_EXPORTER_ZIPKIN_URL) {
exporter = new ZipkinExporter({url:OTEL_EXPORTER_ZIPKIN_URL});
}
else {
exporter = new OTLPTraceExporter({
url: OTEL_EXPORTER_COLLECTOR_URL
});
exporters.push(new JaegerExporter());
}
provider.addSpanProcessor(new BatchSpanProcessor(exporter, {
// The maximum queue size. After the size is reached spans are dropped.
maxQueueSize: 100,
// The maximum batch size of every export. It must be smaller or equal to maxQueueSize.
maxExportBatchSize: 10,
// The interval between two consecutive exports
scheduledDelayMillis: 500,
// How long the export can run before it is cancelled
exportTimeoutMillis: 30000,
}));
if (OTEL_EXPORTER_ZIPKIN_URL) {
exporters.push(new ZipkinExporter({url:OTEL_EXPORTER_ZIPKIN_URL}));
}
if (OTEL_EXPORTER_COLLECTOR_URL) {
exporters.push(new OTLPTraceExporter({
url: OTEL_EXPORTER_COLLECTOR_URL
}));
}
exporters.forEach((element) => {
provider.addSpanProcessor(new BatchSpanProcessor(element, {
// The maximum queue size. After the size is reached spans are dropped.
maxQueueSize: 100,
// The maximum batch size of every export. It must be smaller or equal to maxQueueSize.
maxExportBatchSize: 10,
// The interval between two consecutive exports
scheduledDelayMillis: 500,
// How long the export can run before it is cancelled
exportTimeoutMillis: 30000,
}));
});
// Initialize the OpenTelemetry APIs to use the NodeTracerProvider bindings
provider.register();
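Because the single exporter chosen by if/else is replaced by an exporters array, more than one exporter can now be registered in the same process. A minimal sketch of enabling two of them together, assuming the module reads these environment variables at load time and that it lives at lib/utils/tracer.js (both assumptions for illustration; the env var names come from the code above):
process.env.OTEL_EXPORTER_ZIPKIN_URL = 'http://zipkin:9411/api/v2/spans';
process.env.OTEL_EXPORTER_COLLECTOR_URL = 'http://otel-collector:4318/v1/traces';
const initTracer = require('./lib/utils/tracer');
initTracer('jambonz-feature-server'); // one BatchSpanProcessor is added per configured exporter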