Meishe Streaming SDK
The Meishe SDK consists of two parts: the EffectSDK and the StreamingSDK, both built on WebAssembly. The StreamingSDK covers video capture, editing, and effects rendering, providing a complete set of video editing capabilities.
This document explains the basic usage of the StreamingSDK. You can edit and preview the sample code online in real time through the embedded StackBlitz links.
1. Basic processes and operations
1.1. Import SDK
You can include the SDK via a CDN. CDN link:
https://alieasset.meishesdk.com/NvWasm/domain/3-14-2-release/9/NvStreamingSdk.js
This link is for testing only; do not use it in a production or customer environment. For production use, contact the business department to obtain the SDK files.
Assuming you are building an SPA, include the SDK via the CDN in your index.html:
index.html:
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<script src="https://alieasset.meishesdk.com/NvWasm/domain/3-14-2-release/9/NvStreamingSdk.js"></script>
</head>
</html>
1.2. Install the SDK loader
You need to use the meishewasmloader provided by Meishe to load the SDK.
Install with npm:
npm install meishewasmloader
Or with pnpm:
pnpm add meishewasmloader
Or with yarn:
yarn add meishewasmloader
1.3. Environment Setup
The SDK depends on SharedArrayBuffer. Due to browser security restrictions, you need to enable COOP and COEP in your environment. You can make the following changes in your project:
vite.config.ts:
...
server: {
headers: {
'Cross-Origin-Opener-Policy': 'same-origin',
'Cross-Origin-Embedder-Policy': 'require-corp',
},
},
...
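If you are not using Vite, the same two headers must be sent by whatever server hosts your page. Below is a minimal sketch using Express (an assumption, not part of the SDK; adapt it to your own server or hosting platform):
import express from 'express';
const app = express();
// SharedArrayBuffer is only available in a cross-origin-isolated context,
// so every response must carry the COOP/COEP headers:
app.use((_req, res, next) => {
res.setHeader('Cross-Origin-Opener-Policy', 'same-origin');
res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp');
next();
});
app.use(express.static('dist')); // serve your built front-end
app.listen(8080);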
1.4. Initialize the SDK
You can initialize the SDK in the following ways:
import { WASMLoader } from 'meishewasmloader';
const wasmLoader = WASMLoader({
// Loading progress callback
showLoader: function (state) {},
// Failure callback
showError(errorText: string) {},
// Success callback
loadingFinished() {},
});
// Start loading the SDK
wasmLoader.loadEmscriptenModule("https://alieasset.meishesdk.com/NvWasm/domain/3-14-2-release/9/");
1.5. Authorization and authentication (optional)
Videos generated by an SDK that has not passed license verification carry a default watermark, which is removed once verification succeeds. The license file can be obtained by contacting the business department via the Meishe official website. The SDK has a built-in verification interface, used as follows:
// nvsGetStreamingContextInstance is a global method provided after SDK initialization, which can be understood as an instance of StreamingSDK
const nvStreamingContext = nvsGetStreamingContextInstance();
// Authorization verification callback function
nvStreamingContext.onWebRequestAuthFinish = (success) => {
if (!success) {
console.error('SDK authentication failed');
} else {
console.log('SDK authentication succeeded');
}
};
nvStreamingContext.verifySdkLicenseFile('Authentication address')
1.6. Get StreamingContextInstance (SDK engine instance)
After the SDK initializes successfully, a number of built-in methods are mounted on the window object. You can obtain the SDK engine instance directly:
const nvStreamingContext = nvsGetStreamingContextInstance();
Most functionality in the rest of this document depends on nvStreamingContext.
1.7. Creating a LiveWindow
Create a canvas tag with the id "live-window" in your project.
The code is as follows:
<canvas id="live-window"></canvas>
Also record the width and height of the canvas:
let canvas = document.getElementById('live-window') as HTMLCanvasElement;
// devicePixelRatio is the device pixel resolution ratio
let width = canvas.clientWidth * window.devicePixelRatio;
let height = canvas.clientHeight * window.devicePixelRatio;
// LiveWindow has restrictions on the width and height of the container. The width must be a multiple of 4 and the height must be a multiple of 2.
width = canvas.width = width - (width % 4);
height = canvas.height = height - (height % 2);
See window.devicePixelRatio for details.
A LiveWindow is the streaming media window. It can be thought of as the editor's player: the display window for real-time editing results.
Create a LiveWindow:
const liveWindow = nvStreamingContext.createLiveWindow('live-window')
1.8. Creating a Timeline
The timeline is one of the core objects of the editor. Tracks, clips, resources, materials, subtitles, and effects all depend on the timeline; adding them to a timeline is what puts them into use.
A timeline consumes relatively few resources. You can create multiple timelines in one program if necessary, but generally one is enough.
To create a timeline:
// NvsCreateTimelineFlagEnum, NvsVideoResolution, NvsRational, and NvsAudioResolution are global classes injected after SDK initialization and do not need to be imported.
// Timeline creation flags
const TIMELINE_FLAGS =
NvsCreateTimelineFlagEnum.DontAddDefaultVideoTransition +
NvsCreateTimelineFlagEnum.ForceAudioSampleFormatUsed +
NvsCreateTimelineFlagEnum.RecordingUserOperation +
NvsCreateTimelineFlagEnum.SyncAudioVideoTrasitionInVideoTrack;
const timeline = nvStreamingContext.createTimeline(
new NvsVideoResolution(width,height),
new NvsRational(25, 1),
new NvsAudioResolution(44100, 2),
TIMELINE_FLAGS
);
1.9. Connecting Timeline and LiveWindow
A Timeline and a LiveWindow are independent modules until connected; only after connecting them can the LiveWindow display the frames driven by the Timeline.
The connection method is as follows:
nvStreamingContext.connectTimelineWithLiveWindow(
timeline,
liveWindow
);
1.10. Add Track
Tracks are containers for clips. A timeline can contain multiple tracks, and track order determines how their images stack: tracks added later are rendered above earlier ones. To add audio and video, you must first create a track.
Tracks are divided into audio tracks and video tracks. The creation method is as follows:
const videoTrack = timeline.appendVideoTrack();
const audioTrack = timeline.appendAudioTrack();
1.11. Adding clips to a track
Material exists on a track in the form of clips, and a track can hold multiple pieces of material. Each piece of material must be converted to m3u8 format by the Meishe transcoding service before it can be added to a track.
Before adding it to a track, you need to use the FS interface provided by the SDK to download the material and install it into memory.
The operation is as follows:
// The transcoded resource address
let videoResourceUrl =
'https://alieasset.meishesdk.com/editor/2022/07/05/video/afd62303-3492-4c31-b09c-1c56c63b46a2/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8';
let audioResourceUrl =
'https://alieasset.meishesdk.com/test/2024/05/24/audio/6dc60190-c22b-4740-a299-3981d1a8c7ec/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8';
const videoResponse = await fetch(videoResourceUrl);
const videoText = await videoResponse.text();
const videoPath = `/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`
// FS is a global object and does not need to be imported
FS.writeFile(videoPath, videoText)
const audioResponse = await fetch(audioResourceUrl);
const audioText = await audioResponse.text();
const audioPath = `/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8`
FS.writeFile(audioPath, audioText)
const videoClip = videoTrack.appendClip(videoPath);
const audioClip = audioTrack.appendClip(audioPath);
1.12. Play/Pause
The SDK provides various callback functions; among them, onStreamingEngineStateChanged monitors changes in the SDK playback state.
// NvsStreamingEngineStateEnum does not need to be imported
let playState = false;
nvStreamingContext.onStreamingEngineStateChanged = (state) => {
switch (state) {
case NvsStreamingEngineStateEnum.StreamingEngineStatePlayback: // Playing
playState = true;
break;
case NvsStreamingEngineStateEnum.StreamingEngineStateStopped: // Stopped
case NvsStreamingEngineStateEnum.StreamingEngineStateSeeking: // Seeking (can be understood as jumping in time)
playState = false;
break;
default:
playState = false;
break;
}
};
After getting the current playback state, you can play or pause accordingly:
let flags = 0;
if (playState) {
// streamingEngineReadyForTimelineModification achieves a similar pause effect, but stop() is the clearer, recommended way to pause.
nvStreamingContext.stop();
} else {
nvStreamingContext.playbackTimeline(
timeline, // Current timeline object
0, // Start time, in microseconds
-1, // End time, in microseconds, -1 means playing to the end
NvsVideoPreviewSizeModeEnum.LiveWindowSize, // Video size mode
true, // Whether to preload
(flags |= NvsPlaybackFlagEnum.BuddyHostOriginVideoFrame) // Playback flags
);
}
At this point, you have completed a simple editor that includes video, audio, and supports play and pause.
You can click StackBlitz to edit and preview it in real time in the online environment.
2. Add multiple tracks
Multiple tracks can be added to the timeline.
With multiple tracks you can create split-screen and multi-view effects by adjusting each clip's display region, and layered, three-dimensional looks through masks, opacity, blur, keyframes, and so on. It is a very practical feature.
2.1. Preparation
Before modifying the timeline and liveWindow, you need to put nvStreamingContext into a ready-for-modification state:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
2.2. Create new tracks
const videoTrack1 = timeline.appendVideoTrack();
const videoTrack2 = timeline.appendVideoTrack();
const audioTrack1 = timeline.appendAudioTrack();
const audioTrack2 = timeline.appendAudioTrack();
2.3. Add material
// addResource is a wrapper around the resource-installation steps. You can wrap it yourself based on 'Adding clips to a track' above (a sketch follows below), or copy the online code from StackBlitz
let path = await addResource(videoResourceUrl);
let audioPath = await addResource(audioResourceUrl);
const videoClip1 = videoTrack1.appendClip2(path, 0, 185000000);
const videoClip2 = videoTrack2.appendClip2(path, 0, 185000000);
const audioClip1 = audioTrack1.appendClip2(audioPath, 0, 185000000);
const audioClip2 = audioTrack2.appendClip2(audioPath, 0, 185000000);
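The addResource helper referenced above is not defined in this document. A minimal sketch, wrapping the FS steps from 'Adding clips to a track' (the StackBlitz version may differ):
// Fetch a transcoded m3u8 and install it into the SDK's in-memory FS,
// returning the path that can be passed to appendClip/appendClip2.
async function addResource(resourceUrl: string): Promise<string> {
const response = await fetch(resourceUrl);
const text = await response.text();
// Use the file name from the URL as the in-memory path
const path = '/' + resourceUrl.split('/').pop();
FS.writeFile(path, text);
return path;
}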
At this point, the multi-track setup is complete. However, since the two tracks completely overlap and use the same resource, the footage on the newly added track completely covers the footage on the old track. Let's apply some effects to make the two tracks distinguishable.
2.4. Add special effects
Apply effects to distinguish the original track's image.
First enable property effects on the clip:
videoClip.enablePropertyVideoFx(true);
Effect parameters fall into three categories here: string, boolean, and float, set via setStringVal, setBooleanVal, and setFloatVal respectively. For specific parameters, see the Built-in Effects Table.
First obtain the clip's property effect instance:
const propertyFx = videoClip.getPropertyVideoFx();
Add built-in effects:
Built-in effects ship with the SDK and are natively supported; they can be used without installing any effect package.
propertyFx.setFloatVal('Scale X', 0.5); // X-axis scale
propertyFx.setFloatVal('Scale Y', 0.5); // Y-axis scale
propertyFx.setFloatVal('Trans X', -100); // X-axis translation
propertyFx.setFloatVal('Trans Y', 100); // Y-axis translation
propertyFx.setFloatVal('Rotation', 30); // Clockwise rotation
propertyFx.setFloatVal('Opacity', 0.9); // Opacity
Add entry and exit animations:
Entry and exit animations are not built-in special effects and require the installation of a special effects package to be implemented.
const inAnimationPackageUrl = 'https://qasset.meishesdk.com/material/pu/videofx/0A2158E2-A290-4CFB-B1FD-868A96ED9E8B/0A2158E2-A290-4CFB-B1FD-868A96ED9E8B.8.videofx';
const inAnimationUuid = await installAsset(inAnimationPackageUrl);
// Add animation, or combine animations
propertyFx.setStringVal('Package Id', inAnimationUuid as string);
// The animation's start and end times are relative to timestamps within the video clip
propertyFx.setFloatVal('Package Effect In', 0);// The start time of the animation, in microseconds
propertyFx.setFloatVal('Package Effect Out', 2000000);// End time of the animation, in microseconds
Complete code:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
const videoTrack = timeline.appendVideoTrack();
let path = await addResource(videoResourceUrl);
const videoClip = videoTrack.appendClip2(path, 0, 185000000);
if (videoClip) {
// Adjust videos by zooming, panning, rotating, and animating
videoClip.enablePropertyVideoFx(true);
const propertyFx = videoClip.getPropertyVideoFx();
if (propertyFx) {
propertyFx.setFloatVal('Scale X', 0.5);
propertyFx.setFloatVal('Scale Y', 0.5);
propertyFx.setFloatVal('Trans X', -100);
propertyFx.setFloatVal('Trans Y', 100);
propertyFx.setFloatVal('Rotation', 30);
propertyFx.setFloatVal('Opacity', 0.9);
const inAnimationPackageUrl =
'https://qasset.meishesdk.com/material/pu/videofx/0A2158E2-A290-4CFB-B1FD-868A96ED9E8B/0A2158E2-A290-4CFB-B1FD-868A96ED9E8B.8.videofx';
const inAnimationUuid = await installAsset(inAnimationPackageUrl);
// Add animation, or combine animations
propertyFx.setStringVal('Package Id', inAnimationUuid as string);
// The animation's start and end times are relative to timestamps within the video clip
propertyFx.setFloatVal('Package Effect In', 0);// The start time of the animation, in microseconds
propertyFx.setFloatVal('Package Effect Out', 2000000);// End time of the animation, in microseconds
}
}
// Move the current time to 0 seconds. It is recommended to record currentTime yourself.
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can edit and preview it in real time in the online environment by clicking StackBlitz.
3. Adding a Transition
Transitions are divided into two categories: built-in transitions and material package transitions.
3.1. Built-in transitions
Built-in transitions are transition effects built into the SDK, including
- Fade
- Turning
- Swap
- Stretch In
- Page Curl
- Lens Flare
- Star
- Dip To Black
- Dip To White
- Push To Right
- Push To Top
- Upper Left Into
Built-in transitions do not need to be downloaded and can be used directly.
Usage is as follows:
// The first parameter is the clip index in the track (starting from 0); the second is the name of the built-in transition
videoTrack.setBuiltinTransition(0, 'Turning');
3.2. Material package transitions
In addition to built-in transitions, you can add transitions by loading a material package. Compared with built-in transitions, material packages offer far more options, letting you match transition effects to different video styles. (Contact Meishe business via the Meishe official website to obtain material packages.)
A package transition requires a package URL, similar to this:
const packageUrl = 'https://qasset.meishesdk.com/material/pu/transition/02D05082-E3C3-498D-AAB2-15DC62AB2018/02D05082-E3C3-498D-AAB2-15DC62AB2018.1.videotransition'
After obtaining the address, you need to download the material package and install it into the memory through the package manager. The code is as follows:
async function installAsset(packageUrl: string) {
let res = await fetch(packageUrl);
const packageInFS = '/' + packageUrl.split('/').pop();
await FS.writeFile(packageInFS, new Uint8Array(await res.arrayBuffer()));
// The package ID (UUID) is the file name without its suffixes
const assetUuid = packageInFS.split('.')[0].split('/').pop();
let assetType = NvsAssetPackageTypeEnum.VideoTransition;
return new Promise((resolve, reject) => {
if (assetType === undefined) {
reject(assetUuid || '');
return;
}
// Check the status of asset package first. If it has been installed, don't need install it again.
const status = nvsGetStreamingContextInstance()
.getAssetPackageManager()
.getAssetPackageStatus(assetUuid || '', assetType);
if (status !== NvsAssetPackageStatusEnum.NotInstalled) {
resolve(assetUuid || '');
return;
}
// This callback function means installation finished
nvsGetStreamingContextInstance().getAssetPackageManager().onFinishAssetPackageInstallation =
(assetPackageId, assetPackageFilePath, assetPackageType, error) => {
FS.unlink(assetPackageFilePath, 0);
// error is 0 means success
if (error === 0 && assetPackageId === assetUuid) {
resolve(assetUuid || '');
} else {
reject(assetUuid || '');
}
};
nvsGetStreamingContextInstance()
.getAssetPackageManager()
.installAssetPackage(packageInFS, '', assetType);
});
}
You can also click the StackBlitz link below to view the corresponding code in the utils/util.ts file of the online environment.
After the package is installed, use the material package in the following ways:
const assetUuid = await installAsset(packageUrl);
// The first parameter is the clip index in the track (starting from 0); the second is the installed package's ID (UUID)
videoTrack.setPackagedTransition(1, assetUuid as string);
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
let path = await addResource(videoResourceUrl);
videoTrack.addClip2(
path,
0,
30000000,
38000000
);
// Adding built-in transitions
videoTrack.setBuiltinTransition(0, 'Turning');
// Adding Package Transitions
const packageUrl =
'https://qasset.meishesdk.com/material/pu/transition/02D05082-E3C3-498D-AAB2-15DC62AB2018/02D05082-E3C3-498D-AAB2-15DC62AB2018.1.videotransition';
// installAsset is the package installation method. You can click the StackBlitz link below to view the code in utils/util.ts in the online environment.
const assetUuid = await installAsset(packageUrl);
videoTrack.setPackagedTransition(1, assetUuid as string);
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can click StackBlitz to edit and preview it in real time in the online environment.
4. Add timeline effects
Clip property effects were introduced in the multi-track example above. Timeline effects are similar, but they affect the entire timeline.
Timeline effects are also divided into two types: built-in effects and package effects.
You can click Built-in effects to view the complete list of effects.
4.1. Built-in timeline effects
Usage is as follows:
const timelineVideoFx = timeline.addBuiltinTimelineVideoFx(
0,
5000000,
'Gaussian Blur'
);
4.2. Package effects
A package effect also requires a package URL:
const packageUrl = 'https://qasset.meishesdk.com/material/pu/videofx/8EA07793-A3BB-4719-9882-3534E7D60618/8EA07793-A3BB-4719-9882-3534E7D60618.videofx';
Like transition packages, effect packages need to be downloaded and installed in the same way. If you copy the installAsset method from StackBlitz, it works without modification; if you copied the transition example above, change the assetType:
async function installAsset(packageUrl: string) {
...
let assetType = NvsAssetPackageTypeEnum.VideoFx;
...
}
After the package is installed, use the material package in the following ways:
const assetUuid = await installAsset(packageUrl);
timeline.addPackagedTimelineVideoFx(
0,
5000000,
assetUuid as string
);
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
// Add a built-in timeline effect
const timelineVideoFx = timeline.addBuiltinTimelineVideoFx(
0,
5000000,
'Gaussian Blur'
);
timelineVideoFx.setFloatVal('Radius', 20);
// Add a packaged timeline effect
const packageUrl =
'https://qasset.meishesdk.com/material/pu/videofx/8EA07793-A3BB-4719-9882-3534E7D60618/8EA07793-A3BB-4719-9882-3534E7D60618.videofx';
const assetUuid = await installAsset(packageUrl);
timeline.addPackagedTimelineVideoFx(
0,
5000000,
assetUuid as string
);
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can click StackBlitz to edit and preview it in real time in the online environment.
5. Add subtitles
You can add subtitles in the following ways:
const caption = timeline.addCaption(
'Hello',
0,
5000000,
'',
false
);
5.1. Add subtitle styles
To give your subtitles a more striking look, use subtitle styles.
A subtitle style requires a material package:
const packageUrl = 'https://qasset.meishesdk.com/material/captionstyle/E30D10CF-6693-4BDD-BE66-418F86BB1578.5.captionstyle';
Download and install:
async function installAsset(packageUrl: string) {
...
let assetType = NvsAssetPackageTypeEnum.CaptionStyle;
...
}
Use the subtitle style package:
const assetUuid = await installAsset(packageUrl);
const caption = timeline.addCaption(
'Hello',
0,
5000000,
assetUuid as string,
false
);
5.2. Subtitle transformation
After adding subtitles, you can scale, rotate, and move them:
caption.setCaptionTranslation(new NvsPointF(100, 100));
caption.scaleCaption2(2);
caption.rotateCaption2(45);
5.3. Font
A custom font can be set for subtitles:
const fontUrl = 'https://alieasset.meishesdk.com/font/站酷酷黑体.ttf';
const response = await fetch(fontUrl);
const fontInFS = '/' + fontUrl.split('/').pop();
await FS.writeFile(fontInFS, new Uint8Array(await response.arrayBuffer()));
caption.setFontByFilePath(fontInFS);
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
// Add a styled subtitle. When the text is Chinese, you need to set a font that supports Chinese characters.
const packageUrl =
'https://qasset.meishesdk.com/material/captionstyle/E30D10CF-6693-4BDD-BE66-418F86BB1578.5.captionstyle';
const assetUuid = await installAsset(packageUrl);
const caption = timeline.addCaption(
'你好',
0,
5000000,
assetUuid as string,
false
);
const fontUrl = 'https://alieasset.meishesdk.com/font/站酷酷黑体.ttf';
const response = await fetch(fontUrl);
const fontInFS = '/' + fontUrl.split('/').pop();
await FS.writeFile(fontInFS, new Uint8Array(await response.arrayBuffer()));
// Adjust subtitles by font settings, scaling, translation, and rotation
caption.setFontByFilePath(fontInFS);
caption.setCaptionTranslation(new NvsPointF(100, 100));
caption.scaleCaption2(2);
caption.rotateCaption2(45);
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can click StackBlitz to edit and preview it in real time in the online environment.
6. Modular subtitles
You can add modular subtitles in the following ways:
const caption = timeline.addModularCaption(
'Flower Character',
0,
5000000
);
6.1. Add subtitle styles
Requires a modular subtitle renderer package:
const packageUrl = 'https://qasset.meishesdk.com/material/captionstyle/48734DC5-6E58-46A9-9F48-E18CF1E25A1F.3.captionrenderer';
Download and install:
async function installAsset(packageUrl: string) {
...
let assetType = NvsAssetPackageTypeEnum.CaptionRenderer;
...
}
Use:
const assetUuid = await installAsset(packageUrl);
const caption = timeline.addModularCaption(
'Flower Character',
0,
5000000
);
caption.applyModularCaptionRenderer(assetUuid as string);
6.2. Subtitle transformation
After adding subtitles, you can scale, rotate, and move them:
caption.setCaptionTranslation(new NvsPointF(100, 100));
caption.scaleCaption2(2);
caption.rotateCaption2(45);
6.3. Font
Modular subtitles can also specify fonts:
const fontUrl = 'https://alieasset.meishesdk.com/font/站酷酷黑体.ttf';
const response = await fetch(fontUrl);
const fontInFS = '/' + fontUrl.split('/').pop();
await FS.writeFile(fontInFS, new Uint8Array(await response.arrayBuffer()));
caption.setFontByFilePath(fontInFS);
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
const packageUrl =
'https://qasset.meishesdk.com/material/captionstyle/48734DC5-6E58-46A9-9F48-E18CF1E25A1F.3.captionrenderer';
const assetUuid = await installAsset(packageUrl);
const caption = timeline.addModularCaption(
'Flower Character',
0,
5000000
);
caption.applyModularCaptionRenderer(assetUuid as string);
const fontUrl = 'https://alieasset.meishesdk.com/font/站酷酷黑体.ttf';
const response = await fetch(fontUrl);
const fontInFS = '/' + fontUrl.split('/').pop();
await FS.writeFile(fontInFS, new Uint8Array(await response.arrayBuffer()));
caption.setFontByFilePath(fontInFS);
caption.setFontSize(100);
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can click StackBlitz to edit and preview it in real time in the online environment.
7. Stickers
The SDK does not have built-in stickers, so sticker packs must be imported before they can be used.
Sticker pack:
const packageUrl = 'https://qasset.meishesdk.com/material/pu/animatedsticker/A1509C3D-7F5C-43CB-96EE-639ED7616BB7/A1509C3D-7F5C-43CB-96EE-639ED7616BB7.1.animatedsticker';
Download and install:
async function installAsset(packageUrl: string) {
...
let assetType = NvsAssetPackageTypeEnum.AnimatedSticker;
...
}
You can add stickers in the following ways:
const assetUuid = await installAsset(packageUrl);
const sticker = timeline.addAnimatedSticker(
0,
5000000,
assetUuid as string,
false,
false,
''
);
7.1. Sticker transformation
After adding a sticker, you can scale, rotate, and move it:
sticker.setTranslation(new NvsPointF(-100, 100));
sticker.scaleAnimatedSticker2(0.8);
sticker.rotateAnimatedSticker2(-30);
7.2. Keyframes
Stickers support keyframes, enabling animation and transition effects:
sticker.setCurrentKeyFrameTime(0);
sticker.setTranslation(new NvsPointF(-200, -100));
sticker.setCurrentKeyFrameTime(4000000);
sticker.setTranslation(new NvsPointF(0, 0));
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
// Add animated sticker
const packageUrl = 'https://qasset.meishesdk.com/material/pu/animatedsticker/A1509C3D-7F5C-43CB-96EE-639ED7616BB7/A1509C3D-7F5C-43CB-96EE-639ED7616BB7.1.animatedsticker';
const assetUuid = await installAsset(packageUrl);
const sticker = timeline.addAnimatedSticker(
0,
5000000,
assetUuid as string,
false,
false,
''
);
// Adjust stickers by scaling, translating, and rotating them
sticker.setTranslation(new NvsPointF(-100, 100));
sticker.scaleAnimatedSticker2(0.8);
sticker.rotateAnimatedSticker2(-30);
// Set sticker translation keyframes
sticker.setCurrentKeyFrameTime(0);
sticker.setTranslation(new NvsPointF(-200, -100));
sticker.setCurrentKeyFrameTime(4000000);
sticker.setTranslation(new NvsPointF(0, 0));
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can click StackBlitz to edit and preview it in real time in the online environment.
8. Timeline clips
You can add a timeline clip (sub-timeline) in the following ways:
8.1. Create a sub-timeline
const clipTimeline = nvStreamingContext.createTimeline(
new NvsVideoResolution(960, 540),
new NvsRational(25, 1),
new NvsAudioResolution(44100, 2)
);
8.2. Add tracks and clips
Add tracks, clips, and effects to the sub-timeline.
const videoTrack = clipTimeline.appendVideoTrack();
let path = await addResource(videoResourceUrl);
let clip = videoTrack.addClip2(path, 0, 1000000, 60000000);
clip.enablePropertyVideoFx(true);
const propertyFx = clip.getPropertyVideoFx();
if (propertyFx) {
propertyFx.setFloatVal('Scale X', 0.8);
propertyFx.setFloatVal('Scale Y', 0.8);
propertyFx.setFloatVal('Trans X', -100);
propertyFx.setFloatVal('Trans Y', 100);
propertyFx.setFloatVal('Opacity', 0.7);
}
8.3. Connect to the parent timeline
The above process only creates a new timeline. To nest timelines, you need to call a specific API:
// defaultVideoTrack is the video track of the original timeline
defaultVideoTrack.addTimelineClip(clipTimeline, 0);
The complete code is as follows:
await nvStreamingContext.streamingEngineReadyForTimelineModification();
// Create the sub-timeline
const clipTimeline = nvStreamingContext.createTimeline(
new NvsVideoResolution(960, 540),
new NvsRational(25, 1),
new NvsAudioResolution(44100, 2)
);
const videoTrack = clipTimeline.appendVideoTrack();
let path = await addResource(videoResourceUrl);
let clip = videoTrack.addClip2(path, 0, 1000000, 60000000);
clip.enablePropertyVideoFx(true);
const propertyFx = clip.getPropertyVideoFx();
if (propertyFx) {
propertyFx.setFloatVal('Scale X', 0.8);
propertyFx.setFloatVal('Scale Y', 0.8);
propertyFx.setFloatVal('Trans X', -100);
propertyFx.setFloatVal('Trans Y', 100);
propertyFx.setFloatVal('Opacity', 0.7);
}
// defaultVideoTrack is the video track of the original timeline
defaultVideoTrack.addTimelineClip(clipTimeline, 0);
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
You can edit and preview it in real time in the online environment by clicking StackBlitz.
9. Special Effects
You can click Special Effect Name List to view the complete list of all types and special effect names.
Different entities and special effects in the SDK require calling different APIs.
9.1. Timeline built-in special effects
No download is needed; use it directly by name. It affects all tracks and clips in the timeline within the specified time range.
const raw = timeline.addBuiltinTimelineVideoFx(
0,// inpoint
5000000,// duration
'Mosaic',// Effect name can be replaced
);
9.2. Video track built-in transition effects
Built-in transitions don't require downloading, but there are some prerequisites: the track must contain at least two clips with no time gap between them. Under these conditions, you can set transitions using the following methods:
// The first parameter is the segment index, the second is the transition name
videoTrack.setBuiltinTransition(0, 'Fade');
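For example, with two back-to-back clips on the track (a sketch reusing the addResource helper from earlier; the trim times are illustrative):
const path = await addResource(videoResourceUrl);
// Appended clips sit back to back, so there is no time gap between them
videoTrack.appendClip2(path, 0, 5000000); // clip 0: first 5 seconds of the source
videoTrack.appendClip2(path, 5000000, 10000000); // clip 1: the next 5 seconds
// The transition at index 0 plays between clip 0 and clip 1
videoTrack.setBuiltinTransition(0, 'Fade');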
9.3. Timeline package special effects
Requires installing the effect package; you can copy the installAsset method from Resource Installation directly. Like built-in timeline effects, this affects all tracks and clips in the entire timeline within the specified time range.
const assetUuid = await installAsset(packageUrl);
const videoFx = timeline.addPackagedTimelineVideoFx(
0, // inPoint
5000000, // duration
assetUuid as string // effect package ID
);
9.4. Video track package effects
Also requires installing the effect package; the installAsset method from Resource Installation can be copied directly.
const assetUuid = await installAsset(packageUrl);
const videoFx = videoTrack.addPackagedTrackVideoFx(
0, // inPoint
5000000, // duration
assetUuid as string // effect package ID
);
9.5. Built-in effects for audio clips
No download needed; use it directly. Note that the effect is applied to the audio clip, not the audio track.
const audioFx = audioClip.appendFx('Audio Reverb')
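The returned audioFx is an NvsFx instance, so its parameters can be adjusted with the setters described in section 9.6 below. A sketch (the parameter name here is hypothetical; look up the actual parameters of 'Audio Reverb' in the Effect Name List):
const audioFx = audioClip.appendFx('Audio Reverb');
if (audioFx) {
// 'Wet Gain' is a made-up parameter name for illustration only
audioFx.setFloatVal('Wet Gain', 0.6);
}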
9.6. Property effects
In the Effect Name List, every effect that has a parameter table (listing parameter, type, maximum, minimum, default value, description, and so on) is a property effect. Some property effects require asset packages while others can be used directly; you can tell them apart from the Description column: if it mentions an "xxx ID", the effect requires an asset package.
Before setting property effects, you need to do the following:
videoClip.enablePropertyVideoFx(true) // Enable property effects on the video clip
const fx = videoClip.getPropertyVideoFx() // Get the video clip's property effect instance
const audioFx = audioClip.appendFx('xxx') // Add and get an audio clip effect instance
Depending on the parameter type, property effects are set through different APIs. The main types are:
Type | API |
---|---|
STRING | setStringVal |
BOOL | setBooleanVal |
FLOAT | setFloatVal |
Menu | setMenuVal |
INT | setIntVal |
COLOR | setColorVal |
These setters are all methods on NvsFx, i.e. on the effect instance obtained through getPropertyVideoFx or appendFx. Each takes two parameters: the first is the parameter name from the effect's table, the second is the value. For example:
fx.setStringVal('Package Id','xxx') // To set the filter package, you need to install the resource package first and then perform this operation
fx.setBooleanVal('Beauty Effect',true) // Turn on beauty effects
fx.setFloatVal('Beauty Whitening',0.8) // Whitening intensity 80%
fx.setMenuVal('Fill Mode', 'Stretch') // Set the screen fill mode to stretch
fx.setIntVal('Advanced Beauty Type',1)// To set the advanced beauty type, you need to first configure 'Advanced Beauty Enable' to enable advanced beauty
fx.setColorVal('Makeup Lip Color',new NvsColor(1,0,0,1)) // Beauty lipstick color. NvsColor parameters are 0-1, respectively r, g, b, a
10. Resource Installation
The SDK must install resources into memory before they can be used, so effect packages and material packages have to be downloaded and installed first, via the FS object provided by the SDK. The following is the complete wrapper method:
export async function installAsset(packageUrl: string) {
let res = await fetch(packageUrl);
const packageInFS = '/' + packageUrl.split('/').pop();
await FS.writeFile(packageInFS, new Uint8Array(await res.arrayBuffer()));
const list = packageInFS.split('.');
const assetUuid = list[0].split('/').pop();
const suffix = list.pop();
// Package types are distinguished by file suffix. Note: effect package files must not be renamed.
let assetType = undefined;
if (suffix === 'videofx') {
assetType = NvsAssetPackageTypeEnum.VideoFx;
} else if (suffix === 'captionstyle') {
assetType = NvsAssetPackageTypeEnum.CaptionStyle;
} else if (suffix === 'animatedsticker') {
assetType = NvsAssetPackageTypeEnum.AnimatedSticker;
} else if (suffix === 'videotransition') {
assetType = NvsAssetPackageTypeEnum.VideoTransition;
} else if (suffix === 'makeup') {
assetType = NvsAssetPackageTypeEnum.Makeup;
} else if (suffix === 'facemesh') {
assetType = NvsAssetPackageTypeEnum.FaceMesh;
} else if (suffix === 'captionrenderer') {
assetType = NvsAssetPackageTypeEnum.CaptionRenderer;
} else if (suffix === 'captioncontext') {
assetType = NvsAssetPackageTypeEnum.CaptionContext;
} else if (suffix === 'template') {
assetType = NvsAssetPackageTypeEnum.Template;
}
return new Promise((resolve, reject) => {
if (assetType === undefined) {
reject(assetUuid || '');
return;
}
// Check the package status. If it is already installed, you do not need to install it again.
const status = nvsGetStreamingContextInstance()
.getAssetPackageManager()
.getAssetPackageStatus(assetUuid || '', assetType);
if (status !== NvsAssetPackageStatusEnum.NotInstalled) {
resolve(assetUuid || '');
return;
}
// Determine whether the installation package is successful through the callback
nvsGetStreamingContextInstance().getAssetPackageManager().onFinishAssetPackageInstallation =
(assetPackageId, assetPackageFilePath, assetPackageType, error) => {
FS.unlink(assetPackageFilePath, 0);
// Error 0 indicates success
if (error === 0 && assetPackageId === assetUuid) {
resolve(assetUuid || '');
} else {
reject(assetUuid || '');
}
};
nvsGetStreamingContextInstance()
.getAssetPackageManager()
.installAssetPackage(packageInFS, '', assetType);
});
}
All special effects packages, resource packages, model packages, etc. mentioned in this document can be obtained by contacting the business department via the Meishe official website.
11. Network resources
Generally speaking, resources placed on a track (video, audio, images, etc.) need to be transcoded to m3u8, but the SDK also provides a way to load resources over the network without transcoding.
Compared with fully supported transcoded resources, network resources support fewer formats and require you to extract specific metadata from the resource yourself.
There are many libraries that can do this; we recommend mediabunny, and the code example below uses the mediabunny library to read resource metadata.
Of course, if you are familiar with another library, you can substitute it, as long as the resulting data conforms to the IMetaData format.
Support for network resources is not as complete as for transcoded resources. Testing with the mediabunny library, the supported formats are as follows:
Video Format:
- mp4
- m4v
- 3gp
- mov
Audio Format:
- mp3
- wav
- m4a
The sample code is as follows:
interface IVideoStream {
width: number;
height: number;
duration: number;
}
interface IAudioStream {
duration: number;
channelCount: number;
sampleRate: number;
}
enum EMediaType {
audio = 'audio',
video = 'video',
}
interface IMetaData {
audioStreams: IAudioStream[];
bitrate: number;
duration: number;
mediaType: EMediaType;
videoStreams: IVideoStream[];
webAssetUrl: string;
webLocalFileId: string;
}
async function afterInitialize(resourceData: {
type: 'audio' | 'video';
resourceUrl: string;
}) {
// Create a memory resource folder
FS.mkdir('/localmedia');
// The uuid can be any unique string (e.g. crypto.randomUUID()); generateUUID is a stand-in helper
let uuid = generateUUID();
const webLocalPath = `/localmedia/${uuid}=.weblocal`;
const defaultMetaData = '{"mediaType":"video","webLocalFileId":"","webAssetUrl":"https://alieasset.meishesdk.com/test/resource/video/2025/09/01/92493/0d6b19e4d57b4fe0a7b7df0e3430a801.mp4","duration":30000000,"bitrate":493820,"videoStreams":[{"duration":30000000,"width":864,"height":480}],"audioStreams":[{"duration":30000000,"sampleRate":44100,"channelCount":2}]}';
const nvStreamingContext = nvsGetStreamingContextInstance();
await nvStreamingContext.streamingEngineReadyForTimelineModification();
// Clear the audio and video tracks in the timeline. You can delete this code based on your needs.
let videocount = timeline.videoTrackCount();
for (let i = videocount; i > 0; i--) {
let res = timeline.removeVideoTrack(i - 1);
}
let audiocount = timeline.audioTrackCount();
for (let i = audiocount; i > 0; i--) {
// Remove from the last index down so indices don't shift mid-loop
timeline.removeAudioTrack(i - 1);
}
let webLocalData: IMetaData = JSON.parse(defaultMetaData);
// Use the mediabunny library to read the audio/video metadata
webLocalData = await uploadFile(
resourceData.resourceUrl,
resourceData.type
);
const webLocalString = JSON.stringify(webLocalData);
FS.writeFile(webLocalPath, webLocalString);
if (webLocalData.mediaType === 'video') {
const videoTrack = timeline.appendVideoTrack();
videoTrack.addClipWithSpeedExt2(
webLocalPath,
0,
webLocalData.duration,
0,
webLocalData.duration,
1,
true,
);
} else {
const videoTrack = timeline.appendVideoTrack();
videoTrack.appendClip2(
':/footage/transparent_black.png',
0,
webLocalData.duration,
);
const audioTrack = timeline.appendAudioTrack();
audioTrack.addClipWithSpeedExt2(
webLocalPath,
0,
webLocalData.duration,
0,
webLocalData.duration,
1,
true,
);
}
nvStreamingContext.seekTimeline(
timeline,
0,
NvsVideoPreviewSizeModeEnum.LiveWindowSize,
NvsSeekFlagEnum.BuddyHostVideoFrame
);
}
// Requires: import { Input, ALL_FORMATS, BlobSource, UrlSource } from 'mediabunny';
async function uploadFile(resource: File | string, type: 'audio' | 'video') {
let webAssetUrl = resource instanceof File ? '' : resource;
const source =
resource instanceof File
? new BlobSource(resource)
: new UrlSource(resource);
const input = new Input({
source,
formats: ALL_FORMATS, // Accept all formats
});
let mediaType = type as EMediaType;
let duration = await input.computeDuration().then((res) => res * 1000000);
let tracks = await input.getTracks();
let bitrate = 493820;
let audioStream = {
channelCount: 0,
duration: 0,
sampleRate: 0,
};
let videoStream = {
duration: 0,
width: 0,
height: 0,
};
let hasAudioTrack = false;
for (let i = 0; i < tracks.length; i++) {
let track = tracks[i];
let duration = await track
.computeDuration()
.then((second) => getFrameTime(second * 1000000)); // getFrameTime: see the sketch after this code
// Await the packet scan directly so bitrate is populated before we return
const stats = await track.computePacketStats();
bitrate = stats.averageBitrate;
if (track.isVideoTrack()) {
let height = track.codedHeight;
let width = track.codedWidth;
videoStream = {
duration,
width,
height,
};
} else if (track.isAudioTrack()) {
hasAudioTrack = true;
let channelCount = track.numberOfChannels;
let sampleRate = track.sampleRate;
audioStream = {
channelCount,
duration,
sampleRate,
};
}
}
const metaRealData: IMetaData = {
audioStreams: hasAudioTrack ? [audioStream] : [],
bitrate,
duration: getFrameTime(duration),
mediaType,
videoStreams: mediaType === 'video' ? [videoStream] : [],
webAssetUrl,
webLocalFileId: '',
};
return metaRealData;
}
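The getFrameTime helper used above is not defined in this snippet. A minimal sketch, assuming it snaps a microsecond duration onto the timeline's 25 fps frame grid (the StackBlitz utils may implement it differently):
// Snap a duration in microseconds to a whole number of 25 fps frames
function getFrameTime(microseconds: number, fps = 25): number {
const frameDuration = 1000000 / fps; // 40,000 microseconds per frame at 25 fps
return Math.floor(microseconds / frameDuration) * frameDuration;
}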
12. Local Resources
Looking back at the network-resource example, note that the uploadFile method's resource parameter also accepts a File.
From the SDK's perspective, as long as you write the resource information into the FS in the correct format, it does not care whether the resource came from the network or from disk.
Therefore, the network-resource code needs only minor changes to support local resources.
You can obtain a File through an input element or through your UI library's upload component, as in the sketch below.
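A minimal sketch of the file-input route, reusing the uploadFile method above (the element id and UI wiring are placeholders):
const fileInput = document.querySelector<HTMLInputElement>('#file-input');
if (fileInput) {
fileInput.addEventListener('change', async () => {
const file = fileInput.files?.[0];
if (!file) return;
const type = file.type.startsWith('audio') ? 'audio' : 'video';
// uploadFile accepts a File directly; webAssetUrl stays empty for local files
const metaData = await uploadFile(file, type);
// Then write metaData into FS and add the clip, as in afterInitialize above
console.log(metaData);
});
}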
13. Transcoding
The transcoding service is an important part of the SDK's ecosystem; transcoded resources receive the most complete support (though network/local resources can also be placed on tracks).
Video, audio, and image resources are transcoded into new m3u8 resources. Note that this m3u8 is not a streaming playlist in the usual sense: the internal format is different, and a standard m3u8 file cannot be used directly in the SDK.
If you need the complete transcoding service, or a complete audio and video solution, contact Meishe business via the Meishe official website.