
Build a UGC Live Streaming App with Amazon IVS: Broadcast in Real-Time with Multiple Hosts (Lesson 4.3)
Welcome to Lesson 4.3 in this series, where we're looking at building a web-based user-generated content live streaming application with Amazon IVS. This entire series is available in video format on the AWS Developers YouTube channel, and all of the code for the sample application used in this series can be viewed on GitHub. Refer to the links at the end of the post for more information.
💡 Note: Make sure that you're familiar with lesson 3 of this course, which covers topics like retrieving stream credentials, accessing user devices, and low-latency broadcasting.
To join an Amazon IVS stage, a participant needs a participant token. StreamCat first checks whether the current user already has a token for this stage:

token = await auth.user?.getCurrentTokenForStage(stage?.id);

The getCurrentTokenForStage() function runs a query to find the latest token:
public async getCurrentTokenForStage(stageId: number) {
  const user: User = this;
  const token = await user
    .related('stageTokens')
    .query()
    .where('stageId', stageId)
    .orderBy('expiresAt', 'desc')
    .first();
  return token;
}
If the user does not have a usable token, StreamCat creates one through RealTimeService and stores it in the database:
token = await RealTimeService.createStageToken(auth.user.id.toString(), auth.user?.username, stage?.arn);

await StageToken.create({
  participantId: token.participantToken?.participantId,
  token: token.participantToken?.token,
  userId: Number(token.participantToken?.userId),
  expiresAt: DateTime.fromJSDate(token.participantToken?.expirationTime!),
  stageId: stage.id,
});
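For clarity, here is a rough sketch of how these two pieces could fit together in the route handler: reuse the most recent token when it has not expired, otherwise create and persist a new one. The expiry check and variable names are illustrative; the exact flow lives in the StreamCat repo.

let token = await auth.user?.getCurrentTokenForStage(stage?.id);

// Illustrative expiry check: mint a new participant token when the stored one
// is missing or already expired (expiresAt is a Luxon DateTime on the model).
if (!token || token.expiresAt < DateTime.now()) {
  const created = await RealTimeService.createStageToken(auth.user.id.toString(), auth.user?.username, stage?.arn);
  token = await StageToken.create({
    participantId: created.participantToken?.participantId,
    token: created.participantToken?.token,
    userId: Number(created.participantToken?.userId),
    expiresAt: DateTime.fromJSDate(created.participantToken?.expirationTime!),
    stageId: stage.id,
  });
}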
Next, we define a stage strategy. The strategy tells the SDK how the host participates in the stage and consists of three functions: shouldSubscribeToParticipant(), shouldPublishParticipant(), and stageStreamsToPublish().

The shouldSubscribeToParticipant() function returns a SubscribeType, which can be NONE, AUDIO_ONLY, or AUDIO_VIDEO. When returning a value for this function, the host application does not need to worry about the publish state, current subscription state, or stage connection state. If AUDIO_VIDEO is returned, the SDK waits until the remote participant is publishing before it subscribes, and it updates the host application by emitting events throughout the process. Because the real-time streams in StreamCat always include full audio and video, we will always return SubscribeType.AUDIO_VIDEO.
this.stageStrategy = {
  shouldSubscribeToParticipant: (participant) => {
    return SubscribeType.AUDIO_VIDEO;
  }
}
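Although StreamCat always returns AUDIO_VIDEO, the callback receives the remote participant, so an application could subscribe selectively. The sketch below is purely illustrative and assumes a custom role attribute on the participant token; it is not part of StreamCat.

this.stageStrategy = {
  shouldSubscribeToParticipant: (participant) => {
    // Hypothetical: full audio/video for co-hosts, audio only for everyone else.
    if (participant.attributes?.role === 'host') {
      return SubscribeType.AUDIO_VIDEO;
    }
    return SubscribeType.AUDIO_ONLY;
  },
}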
The shouldPublishParticipant() function tells the SDK whether the local participant should publish their streams to the stage. Hosts in StreamCat always publish, so we return true.
this.stageStrategy = {
  shouldSubscribeToParticipant: (participant) => {
    return SubscribeType.AUDIO_VIDEO;
  },
  shouldPublishParticipant: (participant) => {
    return true;
  },
}
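shouldPublishParticipant() can also be used to hold off on publishing, for example until the host has confirmed their devices. This variation is a sketch only; readyToPublish is a hypothetical flag, and if it changes after the stage is created the strategy needs to be re-evaluated with stage.refreshStrategy().

this.stageStrategy = {
  shouldSubscribeToParticipant: (participant) => SubscribeType.AUDIO_VIDEO,
  // Hypothetical: only publish once the host has confirmed camera and mic selection.
  shouldPublishParticipant: (participant) => this.readyToPublish === true,
}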
Finally, the stageStreamsToPublish() function returns an array of LocalStageStream objects created from audio and video streams from the user's microphone and camera (see lesson 3.2 for more on creating streams).
this.stageStrategy = {
  shouldSubscribeToParticipant: (participant) => {
    return SubscribeType.AUDIO_VIDEO;
  },
  shouldPublishParticipant: (participant) => {
    return true;
  },
  stageStreamsToPublish: () => {
    const videoTrack = this.videoStream.getVideoTracks()[0];
    const audioTrack = this.audioStream.getAudioTracks()[0];
    const streamsToPublish = [
      new LocalStageStream(audioTrack),
      new LocalStageStream(videoTrack)
    ];
    return streamsToPublish;
  },
}
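The strategy above assumes that this.videoStream and this.audioStream already hold MediaStreams captured from the user's camera and microphone. Lesson 3.2 covers device selection in detail; as a reminder, a minimal capture sketch might look like this (the selected device id properties are placeholders):

// Capture the chosen camera and microphone as separate MediaStreams.
this.videoStream = await navigator.mediaDevices.getUserMedia({
  video: { deviceId: this.selectedVideoDeviceId },
});
this.audioStream = await navigator.mediaDevices.getUserMedia({
  audio: { deviceId: this.selectedAudioDeviceId },
});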
With the participant token and the strategy defined, we can create the Stage object:
this.stage = new Stage(stageToken.token, this.stageStrategy);
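The classes and enums used in this lesson (Stage, StageEvents, SubscribeType, StreamType, and LocalStageStream) all come from the Amazon IVS Web Broadcast SDK. If you are installing the SDK from npm rather than using a script tag, the imports look roughly like this:

import {
  Stage,
  StageEvents,
  SubscribeType,
  StreamType,
  LocalStageStream,
} from 'amazon-ivs-web-broadcast';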
Next, we add listeners for the events emitted by the Stage. StreamCat uses two of these events to update the view when a participant joins or leaves a real-time stream.
this.stage.on(StageEvents.STAGE_PARTICIPANT_STREAMS_ADDED, async (participant, streams) => {
  this.stageParticipants.push(participant);
  this.renderParticipant(participant, streams);
  await this.renderAudioToClient(
    participant,
    streams.find((s) => s.streamType === StreamType.AUDIO)
  );
  await this.renderVideosToClient(
    participant,
    streams.find((s) => s.streamType === StreamType.VIDEO)
  );
  await this.updateVideoCompositions();
});
The renderParticipant() function handles adding any streams to the solo participant video (excluding audio for the local participant in order to prevent audio echo for the host).
renderParticipant(participant, streams) {
  let streamsToDisplay = streams;
  if (participant.isLocal) {
    streamsToDisplay = streams.filter((stream) => stream.streamType === StreamType.VIDEO);
  }
  this.$nextTick(() => {
    const localVideo = document.getElementById(`participant-${participant.id}`);
    const mediaStream = localVideo.srcObject || new MediaStream();
    streamsToDisplay.forEach((stream) => {
      mediaStream.addTrack(stream.mediaStreamTrack);
    });
    localVideo.srcObject = mediaStream;
  });
}
The renderAudioToClient() function handles adding the audio stream to the composite view for broadcasting.
async renderAudioToClient(participant, stream) {
  const broadcastClient = Alpine.raw(this.broadcastClient);
  if (!stream?.mediaStreamTrack) return;
  const participantId = participant.id;
  const audioTrackId = `audio-${participantId}`;
  const mediaStream = new MediaStream();
  mediaStream.addTrack(stream.mediaStreamTrack);
  await broadcastClient.addAudioInputDevice(mediaStream, audioTrackId);
  return Promise.resolve();
}
The renderVideosToClient() function adds the video stream to the composite view, positioning it according to a layout selected by the current number of participants.
async renderVideosToClient(participant, stream) {
  const broadcastClient = Alpine.raw(this.broadcastClient);
  if (!stream?.mediaStreamTrack) return;
  const participantId = participant.id;
  const videoId = `video-${participantId}`;
  const pIdx = this.stageParticipants.findIndex((p) => p.id === participantId);
  let config = this.layouts[this.stageParticipants.length - 1][pIdx];
  config.index = pIdx + 1;
  const mediaStream = new MediaStream([stream.mediaStreamTrack]);
  await broadcastClient.addVideoInputDevice(mediaStream, videoId, config);
  return Promise.resolve();
}
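The config passed to addVideoInputDevice() comes from this.layouts, a lookup table keyed by the number of participants on the stage. The real values live in the StreamCat repo; the shape is roughly as follows, assuming a 1280x720 broadcast canvas (the pixel values here are illustrative):

// One array of composition configs per participant count; each entry is the
// position and size of that participant's video on the broadcast canvas.
this.layouts = [
  // 1 participant: full frame
  [{ x: 0, y: 0, width: 1280, height: 720 }],
  // 2 participants: side by side
  [
    { x: 0, y: 180, width: 640, height: 360 },
    { x: 640, y: 180, width: 640, height: 360 },
  ],
];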
When a participant leaves the stage, the STAGE_PARTICIPANT_STREAMS_REMOVED event fires. The handler removes the participant's audio and video from the broadcast client, detaches their tracks from the participant video element, and drops the participant from the list used to lay out the composite view:
this.stage.on(StageEvents.STAGE_PARTICIPANT_STREAMS_REMOVED, (participant, streams) => {
  const broadcastClient = Alpine.raw(this.broadcastClient);
  const videoTrackId = `video-${participant.id}`;
  const audioTrackId = `audio-${participant.id}`;
  const localVideo = document.getElementById(`participant-${participant.id}`);
  const mediaStream = localVideo?.srcObject;
  streams.forEach((stream) => {
    if (broadcastClient.getVideoInputDevice(videoTrackId) && stream.streamType === 'video') {
      broadcastClient.removeVideoInputDevice(videoTrackId);
    }
    if (broadcastClient.getAudioInputDevice(audioTrackId) && stream.streamType === 'audio') {
      broadcastClient.removeAudioInputDevice(audioTrackId);
    }
    mediaStream?.removeTrack(stream.mediaStreamTrack);
  });
  if (localVideo) localVideo.srcObject = mediaStream;
  const pIdx = this.stageParticipants.findIndex((p) => p.id === participant.id);
  this.stageParticipants.splice(pIdx, 1);
  this.updateVideoCompositions();
});
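The SDK emits several other events on the Stage, including connection state changes and participants joining and leaving. StreamCat does not need them for rendering, but a connection state listener is handy while developing. A minimal sketch:

// Optional: log stage connection state transitions while developing.
this.stage.on(StageEvents.STAGE_CONNECTION_STATE_CHANGED, (state) => {
  console.debug('Stage connection state:', state);
});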
Once the strategy and listeners are in place, we join the stage.

this.stage.join();

StreamCat also stores an isLive flag on the Stage object to give the broadcaster the ability to prevent viewers from joining until they are ready. When the host is ready, they click a button which updates this flag:
async toggleRealtime(isLive) {
  await fetch(`/api/stage/update`, {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify({
      isLive,
      chatArn: this.chatArn,
    }),
  });
  this.isBroadcasting = isLive;
  return;
}
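On the backend, the /api/stage/update endpoint persists that flag. The exact implementation is in the StreamCat repo; as a rough AdonisJS-style sketch (the controller, model, and import paths here are assumptions for illustration):

import { HttpContextContract } from '@ioc:Adonis/Core/HttpContext';
import Stage from 'App/Models/Stage';

export default class StageController {
  // Hypothetical handler: flip isLive on the broadcaster's stage so that viewers
  // are only allowed to join when the host says they are ready.
  public async update({ auth, request }: HttpContextContract) {
    const { isLive } = request.only(['isLive']);
    const stage = await Stage.findByOrFail('userId', auth.user!.id);
    stage.isLive = isLive;
    await stage.save();
    return stage;
  }
}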
Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.