26 | 26 | {"shape":"AccessDeniedException"},
27 | 27 | {"shape":"ResourceNotFoundException"}
28 | 28 | ],
29 | | - "documentation":"<p> Join the ongoing one way-video and/or multi-way audio WebRTC session as a video producing device for an input channel. If there’s no existing session for the channel, a new streaming session needs to be created, and the Amazon Resource Name (ARN) of the signaling channel must be provided. </p> <p>Currently for the <code>SINGLE_MASTER</code> type, a video producing device is able to ingest both audio and video media into a stream, while viewers can only ingest audio. Both a video producing device and viewers can join the session first, and wait for other participants.</p> <p>While participants are having peer to peer conversations through webRTC, the ingested media session will be stored into the Kinesis Video Stream. Multiple viewers are able to playback real-time media.</p> <p>Customers can also use existing Kinesis Video Streams features like <code>HLS</code> or <code>DASH</code> playback, Image generation, and more with ingested WebRTC media.</p> <note> <p>Assume that only one video producing device client can be associated with a session for the channel. If more than one client joins the session of a specific channel as a video producing device, the most recent client request takes precedence. </p> </note>" |
| 29 | + "documentation":"<note> <p>Before using this API, you must call the <code>GetSignalingChannelEndpoint</code> API to request the WEBRTC endpoint. You then specify the endpoint and region in your <code>JoinStorageSession</code> API request.</p> </note> <p>Join the ongoing one way-video and/or multi-way audio WebRTC session as a video producing device for an input channel. If there’s no existing session for the channel, a new streaming session needs to be created, and the Amazon Resource Name (ARN) of the signaling channel must be provided. </p> <p>Currently for the <code>SINGLE_MASTER</code> type, a video producing device is able to ingest both audio and video media into a stream. Only video producing devices can join the session and record media.</p> <important> <p>Both audio and video tracks are currently required for WebRTC ingestion.</p> <p>Current requirements:</p> <ul> <li> <p>Video track: H.264</p> </li> <li> <p>Audio track: Opus</p> </li> </ul> </important> <p>The resulting ingested video in the Kinesis video stream will have the following parameters: H.264 video and AAC audio.</p> <p>Once a master participant has negotiated a connection through WebRTC, the ingested media session will be stored in the Kinesis video stream. Multiple viewers are then able to play back real-time media through our Playback APIs.</p> <p>You can also use existing Kinesis Video Streams features like <code>HLS</code> or <code>DASH</code> playback, image generation via <a href=\"https://docs.aws.amazon.com/kinesisvideostreams/latest/dg/gs-getImages.html\">GetImages</a>, and more with ingested WebRTC media.</p> <note> <p>S3 image delivery and notifications are not currently supported.</p> </note> <note> <p>Assume that only one video producing device client can be associated with a session for the channel. If more than one client joins the session of a specific channel as a video producing device, the most recent client request takes precedence. </p> </note> <p> <b>Additional information</b> </p> <ul> <li> <p> <b>Idempotent</b> - This API is not idempotent.</p> </li> <li> <p> <b>Retry behavior</b> - This is counted as a new API call.</p> </li> <li> <p> <b>Concurrent calls</b> - Concurrent calls are allowed. An offer is sent once per each call.</p> </li> </ul>" |
| 30 | + }, |
| 31 | + "JoinStorageSessionAsViewer":{ |
| 32 | + "name":"JoinStorageSessionAsViewer", |
| 33 | + "http":{ |
| 34 | + "method":"POST", |
| 35 | + "requestUri":"/joinStorageSessionAsViewer", |
| 36 | + "responseCode":200 |
| 37 | + }, |
| 38 | + "input":{"shape":"JoinStorageSessionAsViewerInput"}, |
| 39 | + "errors":[ |
| 40 | + {"shape":"ClientLimitExceededException"}, |
| 41 | + {"shape":"InvalidArgumentException"}, |
| 42 | + {"shape":"AccessDeniedException"}, |
| 43 | + {"shape":"ResourceNotFoundException"} |
| 44 | + ], |
| 45 | + "documentation":"<p> Join the ongoing one way-video and/or multi-way audio WebRTC session as a viewer for an input channel. If there’s no existing session for the channel, create a new streaming session and provide the Amazon Resource Name (ARN) of the signaling channel (<code>channelArn</code>) and client id (<code>clientId</code>). </p> <p>Currently for <code>SINGLE_MASTER</code> type, a video producing device is able to ingest both audio and video media into a stream, while viewers can only ingest audio. Both a video producing device and viewers can join a session first and wait for other participants. While participants are having peer to peer conversations through WebRTC, the ingested media session will be stored into the Kinesis Video Stream. Multiple viewers are able to playback real-time media. </p> <p>Customers can also use existing Kinesis Video Streams features like <code>HLS</code> or <code>DASH</code> playback, Image generation, and more with ingested WebRTC media. If there’s an existing session with the same <code>clientId</code> that's found in the join session request, the new request takes precedence.</p>" |
30 | 46 | }
31 | 47 | },
32 | 48 | "shapes":{
46 | 62 | "type":"string",
47 | 63 | "pattern":"^arn:(aws[a-zA-Z-]*):kinesisvideo:[a-z0-9-]+:[0-9]+:[a-z]+/[a-zA-Z0-9_.-]+/[0-9]+$"
48 | 64 | },
| 65 | + "ClientId":{ |
| 66 | + "type":"string", |
| 67 | + "max":256, |
| 68 | + "min":1, |
| 69 | + "pattern":"^[a-zA-Z0-9_.-]+$" |
| 70 | + }, |
49 | 71 | "ClientLimitExceededException":{
50 | 72 | "type":"structure",
51 | 73 | "members":{
70 | 92 | },
71 | 93 | "exception":true
72 | 94 | },
| 95 | + "JoinStorageSessionAsViewerInput":{ |
| 96 | + "type":"structure", |
| 97 | + "required":[ |
| 98 | + "channelArn", |
| 99 | + "clientId" |
| 100 | + ], |
| 101 | + "members":{ |
| 102 | + "channelArn":{ |
| 103 | + "shape":"ChannelArn", |
| 104 | + "documentation":"<p> The Amazon Resource Name (ARN) of the signaling channel. </p>" |
| 105 | + }, |
| 106 | + "clientId":{ |
| 107 | + "shape":"ClientId", |
| 108 | + "documentation":"<p> The unique identifier for the sender client. </p>" |
| 109 | + } |
| 110 | + } |
| 111 | + }, |
73 | 112 | "JoinStorageSessionInput":{
74 | 113 | "type":"structure",
75 | 114 | "required":["channelArn"],
94 | 133 | },
95 | 134 | "String":{"type":"string"}
96 | 135 | },
97 | | - "documentation":"<p> </p>" |
| 136 | + "documentation":"<p><fullname>webrtc</fullname> <p> </p></p>" |
98 | 137 | }
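
For context on how the new operations fit together, here is a minimal sketch of calling them from Python with boto3. It assumes the boto3 service names "kinesisvideo" and "kinesis-video-webrtc-storage"; the channel ARN, region, and clientId values are placeholders, and the endpoint-discovery role and protocol settings may differ for your channel configuration.

    import boto3

    # Placeholder values -- replace with your own channel ARN and region.
    CHANNEL_ARN = "arn:aws:kinesisvideo:us-west-2:123456789012:channel/example-channel/1234567890123"
    REGION = "us-west-2"

    kvs = boto3.client("kinesisvideo", region_name=REGION)

    # Per the updated JoinStorageSession documentation, first request the
    # WEBRTC endpoint for the signaling channel.
    resp = kvs.get_signaling_channel_endpoint(
        ChannelARN=CHANNEL_ARN,
        SingleMasterChannelEndpointConfiguration={
            "Protocols": ["WEBRTC"],
            "Role": "MASTER",
        },
    )
    webrtc_endpoint = next(
        e["ResourceEndpoint"]
        for e in resp["ResourceEndpointList"]
        if e["Protocol"] == "WEBRTC"
    )

    # Point the webrtc-storage client at the returned endpoint.
    storage = boto3.client(
        "kinesis-video-webrtc-storage",
        region_name=REGION,
        endpoint_url=webrtc_endpoint,
    )

    # Join as the video producing device (master).
    storage.join_storage_session(channelArn=CHANNEL_ARN)

    # Join as a viewer; clientId must match ^[a-zA-Z0-9_.-]+$ (1-256 chars).
    storage.join_storage_session_as_viewer(
        channelArn=CHANNEL_ARN,
        clientId="example-viewer-01",
    )

Because JoinStorageSession is not idempotent and each concurrent call sends a new offer, callers would typically avoid issuing overlapping joins for the same channel from the same device.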