
Live Streaming from Unreal Engine to Amazon IVS

Explore the integration between Unreal Engine and Amazon Interactive Video Service (IVS), demonstrating how you can stream live content from Unreal Engine directly to Amazon IVS.

Tony Vu
Amazon Employee
Published Oct 22, 2024
Last Modified Mar 27, 2025
This blog post was co-authored by Daniel Gonzales, Senior Worldwide Partner Solutions Architect, and Patrick Palmer, Principal Partner Solutions Architect, both at AWS.
In this blog post, we will explore the integration between Unreal Engine and Amazon Interactive Video Service (IVS), demonstrating how you can stream live content from Unreal Engine directly to Amazon IVS.

What is Amazon IVS?

Amazon IVS is a fully managed live streaming service that empowers developers to build low-latency and real-time streaming applications. It leverages the same video technology that powers Twitch, enabling developers to tap into the same scalable and global infrastructure for delivering live streaming video to audiences worldwide. With Amazon IVS, developers can focus on creating engaging content and innovating their applications rather than managing the underlying video technology.

Why live stream from Unreal Engine?

Imagine being able to unleash Unreal Engine's powerful graphics and rendering capabilities beyond the confines of your desktop computer. Consider some of the following use cases:
  • Live streaming of gameplay: Engage a gamer's community around their skills and experiences by enabling them to live stream their gameplay.
  • Real-time sharing of game development processes: Game development companies can share behind-the-scenes development processes in real time, allowing their audience to witness them firsthand.
  • Exclusive sneak peeks of upcoming titles: Game studios can build excitement and anticipation among their fan base by offering exclusive sneak peeks of upcoming gaming titles.
  • Live streaming Unreal Engine tutorials and workshops: Unreal Engine experts can teach a mass audience how to use the software to create gaming or other content, providing an interactive and engaging learning experience.
As you can see, there is a wide range of potential applications for live streaming Unreal Engine content. Whether you're a game developer, content creator, or a creative business looking to leverage the power of live streaming, this article will walk you through the process of getting started with Unreal Engine and Amazon IVS. Let's dive in!

Prerequisites

Before we get started, let’s make sure we have the necessary tools and resources in place.
  1. Unreal Engine. You will need to have Unreal Engine version 5.4.3 or later installed.
  2. AWS Account. We will be leveraging the power of Amazon Web Services (AWS) to create the backend resources needed to live stream from Unreal Engine to Amazon IVS.
Once the prerequisites are in place, we will follow these steps:
  • Part 1 - Create an Unreal Editor project and Install Pixel Streaming Plugin
  • Part 2 - Create an IVS Stage on AWS
  • Part 3 - Write code to forward the media stream from Unreal Editor to IVS
  • Part 4 - Test it out

Solution Overview

At a high level, our solution will forward a media stream containing the contents of Unreal Engine to Amazon IVS for delivery to end viewers. We will install and use the Pixel Streaming Plugin in Unreal Engine, which exposes a WebRTC media stream. To intercept that stream, we will stand up a WebRTC bridge on our local machine: a Go program that connects to the WebSocket endpoint exposed by the Pixel Streaming Plugin's locally running signaling server. Once the media stream is intercepted, it will be forwarded to an Amazon IVS stage for delivery to integrated web and mobile apps using the Amazon IVS Broadcast SDK. With the Amazon IVS Broadcast SDK, developers can easily incorporate live streaming playback capabilities into their web and mobile apps across different platforms.
[Image: Streaming from Unreal Engine to Amazon IVS architecture overview]

Create an Unreal Editor project and Install Pixel Streaming Plugin

From your Unreal Editor, create a new project. In this example, we will use the first-person shooter template.
[Image: Create a new Unreal Engine project]
Starting a new project will take you to the main Unreal Editor window. From here, let's install the Pixel Streaming plug-in, which allows you to live stream content from Unreal Editor directly to web browsers. The plug-in captures the rendered frames from the Unreal Editor and converts them into a media stream that is sent over the internet using the WebRTC protocol. Leveraging WebRTC's real-time communication capabilities, the plugin normally distributes this media stream to web browsers for viewing via a signaling server. However, instead of delivering the stream to browsers, we will write some custom code to forward it directly to Amazon IVS.
From your Unreal Editor window, click Edit -> Plugins.
[Image: Finding the plugins selection in the menu]
In the pop-up window, search for “Pixel Streaming” and select the Pixel Streaming plugin. You will then be prompted to restart UE.
[Image: Searching for the Pixel Streaming plugin]
After restarting, you should see Pixel Streaming as a selectable item in your UE toolbar.
[Image: Pixel Streaming in your Unreal Editor toolbar]
Click on it and then select “Launch Signaling Server”.
[Image: Launching the signaling server]
This local server runs on your workstation, handling the initial connection and negotiation between the Unreal Engine application and the client's web browser. Meanwhile, the plugin captures the rendered output from the Unreal Engine viewport, encodes it, and packages it as a media stream for transmission over WebRTC. We will write some additional code to intercept this media stream and forward it to Amazon IVS.
Next, click the Pixel Streaming plugin in the toolbar again and select “Stream Level Editor”. This instructs the plugin to create a media stream solely from the gameplay viewport. You can also choose to stream the full level editor, which will stream the gameplay viewport and all the UI controls within the editor.
[Image: Streaming the level editor]
Finally, to start gameplay and create the media stream, click on the play button in the toolbar. This will create a media stream that we can send to IVS.
[Image: Starting gameplay and creating the media stream]

Create an Amazon IVS Stage

Now that we have the Pixel Streaming Plugin set up and a media stream being sent, we will need to create an Amazon IVS stage. Forwarding the media stream from the Pixel Streaming Plugin to the stage is necessary for delivering it at scale to viewers. Up to 25K concurrent viewers will be able to join the stage to watch the stream with latency under 300 ms.
From the Amazon IVS console in your AWS account, select Amazon IVS Stage and click Get Started.
[Image: Getting started with Amazon IVS]
Give the stage a name like “unreal-engine-demo”. We could optionally record each individual participant in our live stream, but for now leave that off and click “Create Stage”.
[Image: Creating an Amazon IVS stage]
Once the stage has been created, it will present you with additional configuration information and a preview area to view your live stream directly in the console. Later on, we will be able to view our Unreal Engine gameplay here. Scroll down on the configuration page and click "Create token".
[Image: Creating a participant token]
Name your token “unreal-engine-demo” and click Create token. We give our token both publish and subscribe capabilities, which allows whoever holds the token to both send video to the stage and receive video from it.
[Image: Naming your participant token]
Note: participant tokens can also be created programmatically using the AWS SDK, as sketched below.
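For illustration, here is a minimal sketch of creating a stage and a participant token with the AWS SDK for Go v2. Treat it as a hedged example: the ivsrealtime package's field and constant names shown here are assumptions you should verify against the current SDK documentation.

package main

import (
	"context"
	"fmt"

	"github.com/aws/aws-sdk-go-v2/aws"
	"github.com/aws/aws-sdk-go-v2/config"
	"github.com/aws/aws-sdk-go-v2/service/ivsrealtime"
	"github.com/aws/aws-sdk-go-v2/service/ivsrealtime/types"
)

func main() {
	// Load credentials and region from the default AWS config chain.
	cfg, err := config.LoadDefaultConfig(context.TODO())
	if err != nil {
		panic(err)
	}
	client := ivsrealtime.NewFromConfig(cfg)

	// Create a stage, equivalent to the console steps above.
	stage, err := client.CreateStage(context.TODO(), &ivsrealtime.CreateStageInput{
		Name: aws.String("unreal-engine-demo"),
	})
	if err != nil {
		panic(err)
	}

	// Create a participant token with publish and subscribe capabilities,
	// mirroring the token we created in the console.
	token, err := client.CreateParticipantToken(context.TODO(), &ivsrealtime.CreateParticipantTokenInput{
		StageArn: stage.Stage.Arn,
		Capabilities: []types.ParticipantTokenCapability{
			types.ParticipantTokenCapabilityPublish,
			types.ParticipantTokenCapabilitySubscribe,
		},
	})
	if err != nil {
		panic(err)
	}

	fmt.Println(*token.ParticipantToken.Token)
}

With the stage and token in place, we can now write the code to intercept the media stream created by the Pixel Streaming Plugin and forward it to the Amazon IVS stage we created.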

Forwarding the media stream from Unreal Engine to a Stage

To forward the stream, we will create a small Go application that will run locally on our machine to forward the media stream from the Pixel Streaming Plug-in to the Amazon IVS stage we just created. The complete code is provided below for reference.
//go:build !js
// +build !js

package main

import (
	"bytes"
	"encoding/json"
	"flag"
	"fmt"
	"io/ioutil"
	"net/http"
	"time"

	"github.com/pion/webrtc/v3"
	"golang.org/x/net/websocket"
)

// websocketMessage models the JSON messages exchanged with the Pixel
// Streaming signaling server.
type websocketMessage struct {
	Type                  string                  `json:"type"`
	PeerConnectionOptions webrtc.Configuration    `json:"peerConnectionOptions"`
	Count                 int                     `json:"count"`
	SDP                   string                  `json:"sdp"`
	Candidate             webrtc.ICECandidateInit `json:"candidate"`
	IDs                   []string                `json:"ids"`
	StreamerId            string                  `json:"streamerId"`
}

func main() {
	url := flag.String("url", "ws://localhost/", "URL to UE5 Pixel Streaming WebSocket endpoint")
	origin := flag.String("origin", "http://localhost", "Origin that is passed in HTTP header")
	bearerToken := flag.String("token", "", "IVS Bearer Token")
	flag.Parse()

	if *bearerToken == "" {
		panic("Bearer Token must not be empty")
	}

	conn, err := websocket.Dial(*url, "", *origin)
	if err != nil {
		panic(err)
	}

	defer func() {
		if err = conn.Close(); err != nil {
			panic(err)
		}
	}()

	// Ask the signaling server for the list of available streamers.
	if err = websocket.JSON.Send(conn, websocketMessage{Type: "listStreamers"}); err != nil {
		panic(err)
	}

	peerConnection := &webrtc.PeerConnection{}
	peerConnectionConfig := webrtc.Configuration{}
	data := []byte{}
	jsonMessage := websocketMessage{}

	for {
		if err = websocket.Message.Receive(conn, &data); err != nil {
			panic(err)
		} else if err = json.Unmarshal(data, &jsonMessage); err != nil {
			panic(err)
		}

		switch jsonMessage.Type {
		case "config":
			peerConnectionConfig = jsonMessage.PeerConnectionOptions

		case "offer":
			// Join the IVS stage, then answer Unreal Engine's offer.
			audioTrack, videoTrack := createStagesSession(*bearerToken)
			peerConnection = createUnrealPeerConnection(conn, peerConnectionConfig, audioTrack, videoTrack)

			if err = peerConnection.SetRemoteDescription(webrtc.SessionDescription{Type: webrtc.SDPTypeOffer, SDP: jsonMessage.SDP}); err != nil {
				panic(err)
			}

			answer, err := peerConnection.CreateAnswer(nil)
			if err != nil {
				panic(err)
			}

			if err = peerConnection.SetLocalDescription(answer); err != nil {
				panic(err)
			}

			if err = websocket.JSON.Send(conn, answer); err != nil {
				panic(err)
			}
		case "iceCandidate":
			if err = peerConnection.AddICECandidate(jsonMessage.Candidate); err != nil {
				panic(err)
			}
		case "playerCount":
			fmt.Println("Player Count", jsonMessage.Count)
		case "streamerList":
			// Subscribe to the first available streamer.
			if len(jsonMessage.IDs) >= 1 {
				if err = websocket.JSON.Send(conn, websocketMessage{Type: "subscribe", StreamerId: jsonMessage.IDs[0]}); err != nil {
					panic(err)
				}
			}
		default:
			fmt.Println("Unhandled type", jsonMessage.Type)
		}
	}
}

// createUnrealPeerConnection sets up the peer connection that receives media
// from the Pixel Streaming plugin and forwards it to the IVS stage tracks.
func createUnrealPeerConnection(conn *websocket.Conn, configuration webrtc.Configuration, stagesAudioTrack, stagesVideoTrack *webrtc.TrackLocalStaticRTP) *webrtc.PeerConnection {
	m := &webrtc.MediaEngine{}

	if err := m.RegisterCodec(webrtc.RTPCodecParameters{
		RTPCodecCapability: webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264, ClockRate: 90000, Channels: 0, SDPFmtpLine: "level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e034", RTCPFeedback: nil},
		PayloadType:        96,
	}, webrtc.RTPCodecTypeVideo); err != nil {
		panic(err)
	} else if err := m.RegisterCodec(webrtc.RTPCodecParameters{
		RTPCodecCapability: webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeOpus, ClockRate: 48000, Channels: 2, SDPFmtpLine: "minptime=10;useinbandfec=1", RTCPFeedback: nil},
		PayloadType:        111,
	}, webrtc.RTPCodecTypeAudio); err != nil {
		panic(err)
	}

	peerConnection, err := webrtc.NewAPI(webrtc.WithMediaEngine(m)).NewPeerConnection(configuration)
	if err != nil {
		panic(err)
	}

	if _, err := peerConnection.AddTransceiverFromKind(webrtc.RTPCodecTypeAudio); err != nil {
		panic(err)
	} else if _, err := peerConnection.AddTransceiverFromKind(webrtc.RTPCodecTypeVideo); err != nil {
		panic(err)
	} else if _, err = peerConnection.CreateDataChannel("cirrus", nil); err != nil {
		panic(err)
	}

	peerConnection.OnICEConnectionStateChange(func(connectionState webrtc.ICEConnectionState) {
		fmt.Printf("Unreal Connection State has changed %s \n", connectionState.String())
	})

	peerConnection.OnICECandidate(func(c *webrtc.ICECandidate) {
		if c == nil {
			return
		}
		if err = websocket.JSON.Send(conn, &websocketMessage{Type: "iceCandidate", Candidate: c.ToJSON()}); err != nil {
			panic(err)
		}
	})

	// Copy each incoming RTP packet to the matching IVS stage track.
	peerConnection.OnTrack(func(t *webrtc.TrackRemote, _ *webrtc.RTPReceiver) {
		fmt.Printf("Track has started, of type %d: %s \n", t.PayloadType(), t.Codec().RTPCodecCapability.MimeType)

		buf := make([]byte, 1500)
		for {
			n, _, err := t.Read(buf)
			if err != nil {
				panic(err)
			}

			if t.Kind() == webrtc.RTPCodecTypeAudio {
				if _, err := stagesAudioTrack.Write(buf[:n]); err != nil {
					panic(err)
				}
			} else {
				if _, err := stagesVideoTrack.Write(buf[:n]); err != nil {
					panic(err)
				}
			}
		}
	})

	// Keep the signaling connection alive.
	go func() {
		for range time.NewTicker(20 * time.Second).C {
			if err = websocket.JSON.Send(conn, &websocketMessage{Type: "keepalive"}); err != nil {
				panic(err)
			}
		}
	}()

	return peerConnection
}

// createStagesSession joins the IVS stage over WHIP and returns the local
// audio and video tracks that the Unreal Engine media will be written to.
func createStagesSession(bearerToken string) (*webrtc.TrackLocalStaticRTP, *webrtc.TrackLocalStaticRTP) {
	addToken := func(req *http.Request) {
		req.Header.Add("Authorization", "Bearer "+bearerToken)
	}

	peerConnection, err := webrtc.NewPeerConnection(webrtc.Configuration{})
	if err != nil {
		panic(err)
	}

	peerConnection.OnICEConnectionStateChange(func(connectionState webrtc.ICEConnectionState) {
		fmt.Printf("Stages Connection State has changed %s \n", connectionState.String())
	})

	videoTrack, err := webrtc.NewTrackLocalStaticRTP(webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "pion")
	if err != nil {
		panic(err)
	}

	audioTrack, err := webrtc.NewTrackLocalStaticRTP(webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeOpus}, "audio", "pion")
	if err != nil {
		panic(err)
	}

	if _, err = peerConnection.AddTransceiverFromTrack(audioTrack, webrtc.RTPTransceiverInit{Direction: webrtc.RTPTransceiverDirectionSendonly}); err != nil {
		panic(err)
	} else if _, err = peerConnection.AddTransceiverFromTrack(videoTrack, webrtc.RTPTransceiverInit{Direction: webrtc.RTPTransceiverDirectionSendonly}); err != nil {
		panic(err)
	}

	offer, err := peerConnection.CreateOffer(nil)
	if err != nil {
		panic(err)
	}

	if err := peerConnection.SetLocalDescription(offer); err != nil {
		panic(err)
	}

	// Send the SDP offer to the IVS WHIP endpoint.
	req, err := http.NewRequest("POST", "https://global.whip.live-video.net", bytes.NewBuffer([]byte(offer.SDP)))
	if err != nil {
		panic(err)
	}

	addToken(req)
	req.Header.Add("Content-Type", "application/sdp")

	client := &http.Client{
		CheckRedirect: func(req *http.Request, via []*http.Request) error {
			addToken(req)
			return nil
		},
	}

	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusCreated {
		panic(fmt.Sprintf("POST failed with error: %s", resp.Status))
	}

	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}

	if err = peerConnection.SetRemoteDescription(webrtc.SessionDescription{Type: webrtc.SDPTypeAnswer, SDP: string(body)}); err != nil {
		panic(err)
	}

	return audioTrack, videoTrack
}
The Go application runs on our local machine and connects to the WebSocket endpoint exposed by launching the signaling server from the Unreal Engine Pixel Streaming toolbar. It then establishes WebRTC peer connections to receive audio and video from Unreal Engine and to stream that media to our stage via WHIP, the WebRTC-HTTP Ingestion Protocol. Amazon IVS provides its own WHIP endpoint to enable this functionality. We will use the Pion WebRTC library to handle the WebRTC-related functionality required to set up a connection between Unreal Engine and Amazon IVS.
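For reference, the WHIP exchange this code performs boils down to a single HTTP request and response. The trace below is hypothetical and abbreviated, but it reflects the endpoint, headers, and status code the program actually uses:

POST / HTTP/1.1
Host: global.whip.live-video.net
Authorization: Bearer YOUR_PARTICIPANT_TOKEN_HERE
Content-Type: application/sdp

(SDP offer describing our send-only audio and video tracks)

HTTP/1.1 201 Created
Content-Type: application/sdp

(SDP answer, which the program sets as the remote description)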
Let’s review this code piece by piece to understand what’s going on.
//go:build !js
// +build !js

package main

import (
	"bytes"
	"encoding/json"
	"flag"
	"fmt"
	"io/ioutil"
	"net/http"
	"time"

	"github.com/pion/webrtc/v3"
	"golang.org/x/net/websocket"
)
In the first few lines, we specify that the program should not be compiled for JavaScript environments and declare it as the main package. We import various standard Go libraries for tasks like JSON processing, command-line flag parsing, HTTP operations, and time management. We also import the Pion WebRTC library to handle the WebRTC-related functionality, along with the websocket package so we can establish a WebSocket connection with the signaling server.
Next, we define the structure for the JSON messages we will receive when connecting to the Pixel Streaming endpoint. We will use this to manage the communication between our Go program and the Pixel Streaming signaling server.
type websocketMessage struct {
	Type                  string                  `json:"type"`
	PeerConnectionOptions webrtc.Configuration    `json:"peerConnectionOptions"`
	Count                 int                     `json:"count"`
	SDP                   string                  `json:"sdp"`
	Candidate             webrtc.ICECandidateInit `json:"candidate"`
	IDs                   []string                `json:"ids"`
	StreamerId            string                  `json:"streamerId"`
}
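To make the wire format concrete, here is a hypothetical “offer” message as the signaling server might send it; the SDP value is truncated for illustration, and real messages will vary:

{
  "type": "offer",
  "sdp": "v=0\r\no=- 1234567890 2 IN IP4 127.0.0.1\r\n..."
}

Only the fields relevant to a given message type are populated; the remaining struct fields simply unmarshal to their zero values.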
Next, we write the main logic of our program responsible for setting up a WebSocket connection to the Pixel Streaming signaling server and managing the WebRTC communication. Let’s break this down into a few different parts.
func main() {
	url := flag.String("url", "ws://localhost/", "URL to UE5 Pixel Streaming WebSocket endpoint")
	origin := flag.String("origin", "http://localhost", "Origin that is passed in HTTP header")
	bearerToken := flag.String("token", "", "IVS Bearer Token")
	flag.Parse()

	if *bearerToken == "" {
		panic("Bearer Token must not be empty")
	}

	conn, err := websocket.Dial(*url, "", *origin)
	if err != nil {
		panic(err)
	}

	defer func() {
		if err = conn.Close(); err != nil {
			panic(err)
		}
	}()

	if err = websocket.JSON.Send(conn, websocketMessage{Type: "listStreamers"}); err != nil {
		panic(err)
	}

	peerConnection := &webrtc.PeerConnection{}
	peerConnectionConfig := webrtc.Configuration{}
	data := []byte{}
	jsonMessage := websocketMessage{}

	for {
		if err = websocket.Message.Receive(conn, &data); err != nil {
			panic(err)
		} else if err = json.Unmarshal(data, &jsonMessage); err != nil {
			panic(err)
		}

		switch jsonMessage.Type {
		case "config":
			peerConnectionConfig = jsonMessage.PeerConnectionOptions

		case "offer":
			audioTrack, videoTrack := createStagesSession(*bearerToken)
			peerConnection = createUnrealPeerConnection(conn, peerConnectionConfig, audioTrack, videoTrack)

			if err = peerConnection.SetRemoteDescription(webrtc.SessionDescription{Type: webrtc.SDPTypeOffer, SDP: jsonMessage.SDP}); err != nil {
				panic(err)
			}

			answer, err := peerConnection.CreateAnswer(nil)
			if err != nil {
				panic(err)
			}

			if err = peerConnection.SetLocalDescription(answer); err != nil {
				panic(err)
			}

			if err = websocket.JSON.Send(conn, answer); err != nil {
				panic(err)
			}
		case "iceCandidate":
			if err = peerConnection.AddICECandidate(jsonMessage.Candidate); err != nil {
				panic(err)
			}
		case "playerCount":
			fmt.Println("Player Count", jsonMessage.Count)
		case "streamerList":
			if len(jsonMessage.IDs) >= 1 {
				if err = websocket.JSON.Send(conn, websocketMessage{Type: "subscribe", StreamerId: jsonMessage.IDs[0]}); err != nil {
					panic(err)
				}
			}
		default:
			fmt.Println("Unhandled type", jsonMessage.Type)
		}
	}
}
  1. Command-line flag parsing: First, we define command-line flags for the URL we want to connect to, an origin URL that is passed in the HTTP header of the request, and a participant token. The URL defaults to ws://localhost/ since our signaling server is running locally and listening on port 80 for WebSocket connections. The origin defaults to http://localhost as well since we will be running this Go program locally. Finally, we specify a participant token to authenticate with the IVS stage we just created. This can be thought of as the password that enables us to send audio and video from Unreal Engine to IVS.
  2. WebSocket connection: Next, we establish a WebSocket connection to the URL specified above. We also set up a deferred function to close the connection when the program exits.
  3. Initial communication: After connecting to the WebSocket URL, we set up a for loop that continually parses incoming JSON messages from the WebSocket connection. The different types of messages we handle include:
    • config: Updates the peer connection configuration with the options sent by the signaling server.
    • offer: Also known as a WebRTC offer. When one arrives, we create a session with our IVS stage (more on this later), set up a peer connection with the signaling server from the Pixel Streaming Plugin (more on this later), and send back an answer. This offer and answer process allows our Go program and the signaling server to negotiate and agree on the parameters of the connection.
    • iceCandidate: We use this to add ICE (Interactive Connectivity Establishment) candidates to the peer connection. ICE candidates are network addresses that represent potential connection points for a device. Without them, we wouldn't be able to find the best connection point between our Go program and the signaling server. ICE candidates are also used to overcome network address translation (NAT) and firewall traversal issues that can prevent direct connections between peers. Since both our Go program and our signaling server are running locally, that won't be an issue here.
    • playerCount: Prints the number of connected players, that is, the number of participants that have joined the stage.
    • streamerList: We use this message to subscribe to the first available streamer. In this case, our “streamer” is the media stream provided by the Pixel Streaming Plugin running in Unreal Engine.
Next, once we get a WebRTC offer from the websocket connection to the UE Pixel Streaming endpoint, we connect to the Amazon IVS stage we created earlier. We’ll call this function createStagesSession. A stage session just represents a period of activity when a participant joins a stage. In this case, our Go program is the participant joining the stage.
In this function, we set up a WebRTC connection for streaming audio and video to the IVS stage. After creating a peer connection, we add audio and video tracks, generate a WebRTC offer, and send it to the Amazon IVS WHIP endpoint. Using the participant token provided via the command line, we authenticate with the stage, create send-only transceivers for audio and video, and manage the WebRTC signaling process. The function sends the offer to the IVS Stages server, receives an answer, and sets it as the remote description for the peer connection. It returns the created audio and video tracks, which we then hand to the Unreal peer connection so the Pixel Streaming media can be written to them.
func createStagesSession(bearerToken string) (*webrtc.TrackLocalStaticRTP, *webrtc.TrackLocalStaticRTP) {
	addToken := func(req *http.Request) {
		req.Header.Add("Authorization", "Bearer "+bearerToken)
	}

	peerConnection, err := webrtc.NewPeerConnection(webrtc.Configuration{})
	if err != nil {
		panic(err)
	}

	peerConnection.OnICEConnectionStateChange(func(connectionState webrtc.ICEConnectionState) {
		fmt.Printf("Stages Connection State has changed %s \n", connectionState.String())
	})

	videoTrack, err := webrtc.NewTrackLocalStaticRTP(webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264}, "video", "pion")
	if err != nil {
		panic(err)
	}

	audioTrack, err := webrtc.NewTrackLocalStaticRTP(webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeOpus}, "audio", "pion")
	if err != nil {
		panic(err)
	}

	if _, err = peerConnection.AddTransceiverFromTrack(audioTrack, webrtc.RTPTransceiverInit{Direction: webrtc.RTPTransceiverDirectionSendonly}); err != nil {
		panic(err)
	} else if _, err = peerConnection.AddTransceiverFromTrack(videoTrack, webrtc.RTPTransceiverInit{Direction: webrtc.RTPTransceiverDirectionSendonly}); err != nil {
		panic(err)
	}

	offer, err := peerConnection.CreateOffer(nil)
	if err != nil {
		panic(err)
	}

	if err := peerConnection.SetLocalDescription(offer); err != nil {
		panic(err)
	}

	req, err := http.NewRequest("POST", "https://global.whip.live-video.net", bytes.NewBuffer([]byte(offer.SDP)))
	if err != nil {
		panic(err)
	}

	addToken(req)
	req.Header.Add("Content-Type", "application/sdp")

	client := &http.Client{
		CheckRedirect: func(req *http.Request, via []*http.Request) error {
			addToken(req)
			return nil
		},
	}

	resp, err := client.Do(req)
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()

	if resp.StatusCode != http.StatusCreated {
		panic(fmt.Sprintf("POST failed with error: %s", resp.Status))
	}

	body, err := ioutil.ReadAll(resp.Body)
	if err != nil {
		panic(err)
	}

	if err = peerConnection.SetRemoteDescription(webrtc.SessionDescription{Type: webrtc.SDPTypeAnswer, SDP: string(body)}); err != nil {
		panic(err)
	}

	return audioTrack, videoTrack
}
Using the audio and video tracks created earlier, we can then set up a WebRTC peer connection to communicate with Unreal Engine. We will name this function createUnrealPeerConnection. This function configures the media engine with H.264 video and Opus audio codecs, creates a peer connection, adds audio/video transceivers, and sets up a data channel. The H.264 video codec and Opus audio codec are industry-standard codecs used for efficient video and audio compression, respectively. The audio/video transceivers are responsible for handling the encoding, decoding, and transmission of the audio and video streams using these codecs. These codecs and transceivers help ensure high-quality, low-latency live streaming from the Unreal Engine Pixel Streaming plugin to IVS. 
The function then establishes event handlers for ICE connection state changes, ICE candidates (sending them over the WebSocket), and incoming media tracks. The incoming media tracks come from the Pixel Streaming Plugin. When tracks are received, the handler continuously reads from them and writes the data to the corresponding audio or video tracks we created earlier for our IVS stage. In this function, we essentially create a bridge between the Unreal Engine WebRTC media stream and the Amazon IVS stage to forward the audio and video.
func createUnrealPeerConnection(conn *websocket.Conn, configuration webrtc.Configuration, stagesAudioTrack, stagesVideoTrack *webrtc.TrackLocalStaticRTP) *webrtc.PeerConnection {
	m := &webrtc.MediaEngine{}

	if err := m.RegisterCodec(webrtc.RTPCodecParameters{
		RTPCodecCapability: webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeH264, ClockRate: 90000, Channels: 0, SDPFmtpLine: "level-asymmetry-allowed=1;packetization-mode=1;profile-level-id=42e034", RTCPFeedback: nil},
		PayloadType:        96,
	}, webrtc.RTPCodecTypeVideo); err != nil {
		panic(err)
	} else if err := m.RegisterCodec(webrtc.RTPCodecParameters{
		RTPCodecCapability: webrtc.RTPCodecCapability{MimeType: webrtc.MimeTypeOpus, ClockRate: 48000, Channels: 2, SDPFmtpLine: "minptime=10;useinbandfec=1", RTCPFeedback: nil},
		PayloadType:        111,
	}, webrtc.RTPCodecTypeAudio); err != nil {
		panic(err)
	}

	peerConnection, err := webrtc.NewAPI(webrtc.WithMediaEngine(m)).NewPeerConnection(configuration)
	if err != nil {
		panic(err)
	}

	if _, err := peerConnection.AddTransceiverFromKind(webrtc.RTPCodecTypeAudio); err != nil {
		panic(err)
	} else if _, err := peerConnection.AddTransceiverFromKind(webrtc.RTPCodecTypeVideo); err != nil {
		panic(err)
	} else if _, err = peerConnection.CreateDataChannel("cirrus", nil); err != nil {
		panic(err)
	}

	peerConnection.OnICEConnectionStateChange(func(connectionState webrtc.ICEConnectionState) {
		fmt.Printf("Unreal Connection State has changed %s \n", connectionState.String())
	})

	peerConnection.OnICECandidate(func(c *webrtc.ICECandidate) {
		if c == nil {
			return
		}
		if err = websocket.JSON.Send(conn, &websocketMessage{Type: "iceCandidate", Candidate: c.ToJSON()}); err != nil {
			panic(err)
		}
	})

	peerConnection.OnTrack(func(t *webrtc.TrackRemote, _ *webrtc.RTPReceiver) {
		fmt.Printf("Track has started, of type %d: %s \n", t.PayloadType(), t.Codec().RTPCodecCapability.MimeType)

		buf := make([]byte, 1500)
		for {
			n, _, err := t.Read(buf)
			if err != nil {
				panic(err)
			}

			if t.Kind() == webrtc.RTPCodecTypeAudio {
				if _, err := stagesAudioTrack.Write(buf[:n]); err != nil {
					panic(err)
				}
			} else {
				if _, err := stagesVideoTrack.Write(buf[:n]); err != nil {
					panic(err)
				}
			}
		}
	})

	go func() {
		for range time.NewTicker(20 * time.Second).C {
			if err = websocket.JSON.Send(conn, &websocketMessage{Type: "keepalive"}); err != nil {
				panic(err)
			}
		}
	}()

	return peerConnection
}
With the code done, open your terminal and change to the directory containing your code. Run the following to start the Go program, replacing YOUR_PARTICIPANT_TOKEN_HERE with the participant token you created earlier in the IVS console:
> go run ./main.go -token YOUR_PARTICIPANT_TOKEN_HERE
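If the directory does not yet contain a Go module, you may first need to initialize one so that go run can resolve the Pion WebRTC and websocket dependencies. A minimal sketch (the module name here is arbitrary):

> go mod init ivs-bridge
> go mod tidy
> go run ./main.go -token YOUR_PARTICIPANT_TOKEN_HERE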
After running it, you should see output like the following:
Stages Connection State has changed checking
Unreal Connection State has changed checking
Stages Connection State has changed connected
Unreal Connection State has changed connected
Track has started, of type 111: audio/opus
Track has started, of type 96: video/H264
To witness your Unreal Engine gameplay streaming in action, go back to the IVS console for your stage and generate an additional participant token (you can name it whatever you wish). This will allow us to join the stage from a web browser to view the live stream with the content from Unreal Engine. With the token in hand, navigate to the IVS Real-Time Streaming Web Sample. This publicly accessible web sample is a straightforward web application that integrates the Amazon IVS Broadcast SDK for Web, allowing it to display the video and play back the audio transmitted to the stage. Simply paste in the participant token and you should see the Unreal Engine feed directly in your browser.
[Image: Unreal Engine live stream on the web]
You can also see what the live stream looks like directly in the IVS console.
[Image: Unreal Engine live stream in the IVS console]
As you play the game from Unreal Engine, take note of the delay between performing an action and seeing it in your web browser. This real-time latency, under 300 milliseconds, is a key feature of IVS real-time streaming.

Conclusion

In this article, we delved into the integration between Unreal Engine and Amazon IVS, enabling live streaming of content from Unreal Engine to Amazon IVS. By combining Unreal Engine's advanced graphics and rendering capabilities with the scalability and real-time latency of Amazon IVS, developers can immerse their audiences in the game development process and expand the reach of gaming globally.
As you try out this integration, we encourage you to experiment and discover innovative ways to harness the combined strengths of Unreal Engine and Amazon IVS to elevate your live streaming and gaming experiences for your audience.

About the authors

Tony Vu is a Senior Partner Engineer at Twitch. He specializes in assessing partner technology for integration with Amazon Interactive Video Service (IVS), aiming to develop and deliver comprehensive joint solutions to our IVS customers. Tony enjoys writing and sharing content on LinkedIn.
Daniel Gonzales is a Senior Worldwide Partner Solutions Architect for M&E and Games at AWS. He has spent over twenty years leading technical transformations, from content creation and distribution to audience engagement, and continues working with customers to achieve their business objectives. 
Patrick Palmer is a Principal Partner Solutions Architect for Media and Entertainment at AWS. He has led in the entertainment 3D technology space for 25+ years, developing cutting-edge technologies and helping entertainment industry customers to navigate their greatest challenges.
 

Any opinions in this post are those of the individual author and may not reflect the opinions of AWS.
