MeiCam SDK For Web
3.14.2
Meishe SDK is committed to lowering the technical barrier of web video development, so that programmers with only web front-end development experience can build video recording and editing features with excellent performance and rich rendering effects.
The SDK's WebAssembly module requires the page to be cross-origin isolated, so the server must send the following response headers:

```
Cross-Origin-Opener-Policy: same-origin
Cross-Origin-Embedder-Policy: require-corp
```

Taking a development environment as an example:
```javascript
// vite.config.js
export default defineConfig({
  plugins: [
    // ...,
    {
      name: "configure-response-headers",
      configureServer: (server) => {
        server.middlewares.use((_req, res, next) => {
          res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
          res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
          next();
        });
      },
    },
  ],
  // ...
});
```
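Once the headers are configured, you can verify at runtime that the page is actually cross-origin isolated (the COOP/COEP pair above is what enables `crossOriginIsolated` and `SharedArrayBuffer` in browsers). A minimal sketch, not part of the SDK; the warning text is illustrative:

```javascript
// Runtime sanity check for cross-origin isolation (COOP/COEP headers).
// `crossOriginIsolated` is a standard browser global; it is guarded here
// so the snippet also runs outside a browser.
function checkCrossOriginIsolation() {
  const isolated =
    typeof crossOriginIsolated !== "undefined" && crossOriginIsolated === true;
  if (!isolated) {
    console.warn(
      "Page is not cross-origin isolated; the SDK's WebAssembly module may fail to load. " +
        "Check the Cross-Origin-Opener-Policy / Cross-Origin-Embedder-Policy response headers."
    );
  }
  return isolated;
}
```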
If you do not have the Meishe SDK library yet, please contact Meishe's business team first to obtain the SDK and authorization:
<script src="https://alieasset.meishesdk.com/NvWasm/domain/13/NvEffectSdk.js"></script>
npm i meishewasmloader --save
```javascript
import { WASMLoader } from 'meishewasmloader';

const wasmLoader = WASMLoader({
  // Loading progress callback
  showLoader: function (state) {},
  // Failure callback
  showError(errorText) {},
  // Success callback
  loadingFinished() {},
});

// Start loading the wasm module
wasmLoader.loadEmscriptenModule("https://alieasset.meishesdk.com/NvWasm/domain/13/");
```
```javascript
// Callback for authorization verification
nvStreamingContext.onWebRequestAuthFinish = (success) => {
  if (!success) {
    console.error('SDK authentication failed');
  } else {
    console.log('SDK authentication successful');
  }
};
nvStreamingContext.verifySdkLicenseFile('Authentication Address');
```
NvsStreamingContext is the streaming context class of Meishe SDK and can be regarded as the entrance to the entire SDK framework.
<canvas id="live-window" />
```javascript
var canvas = document.getElementById("live-window");
canvas.width = canvas.clientWidth * window.devicePixelRatio;
canvas.height = canvas.clientHeight * window.devicePixelRatio;
```
The aspect ratio of NvsLiveWindow should be 1:1, 4:3, 16:9, 9:16, etc., and preferably matches the aspect ratio of the footage being edited; otherwise the previewed image will be cropped.
```typescript
declare const NvsLiveWindowFillModeEnum: Readonly<{
  // The image fills the window proportionally, cropped if necessary (default mode)
  PreserveAspectCrop: 0;
  // The image is scaled uniformly to fit inside the window, without cropping
  PreserveAspectFit: 1;
  // The image is stretched to fill the window
  Stretch: 2;
}>;
For the three filling modes, the pictures are shown below:
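The geometry behind the three modes can be sketched as a small helper; this is illustrative math, not SDK code:

```javascript
// Sketch (not SDK code): compute the rectangle that an image of
// (srcW x srcH) occupies inside a (winW x winH) live window for each
// fill mode, mirroring the semantics described above.
function renderedRect(srcW, srcH, winW, winH, mode) {
  if (mode === 2 /* Stretch */) return { x: 0, y: 0, w: winW, h: winH };
  const fit = mode === 1; // PreserveAspectFit
  const scale = fit
    ? Math.min(winW / srcW, winH / srcH)  // whole image visible, possible bars
    : Math.max(winW / srcW, winH / srcH); // window covered, possible cropping
  const w = srcW * scale;
  const h = srcH * scale;
  return { x: (winW - w) / 2, y: (winH - h) / 2, w, h };
}
```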
The FS class is the SDK's resource management class, responsible mainly for loading, creating, and deleting resources. Use FS.mkdir(), FS.readFile(), FS.writeFile(), and other methods to manage resources. For more methods, please refer to the FS reference.
```javascript
const response = await fetch('https://alieasset.meishesdk.com/editor/2022/07/05/video/afd62303-3492-4c31-b09c-1c56c63b46a2/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8');
const text = await response.text();
const path = `/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`;
FS.writeFile(path, text);
```
```javascript
FS.mkdir('/m3u8');
const path = `/m3u8/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`;
FS.writeFile(path, text);
```
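The examples above derive the in-memory path from the URL's last segment; a small helper (an illustrative convention, not an SDK API) keeps that consistent:

```javascript
// Map a remote asset URL to a path in the SDK's in-memory file system,
// using the URL's final path segment and ignoring any query string.
function fsPathForUrl(url, dir = "/") {
  const name = new URL(url).pathname.split("/").pop();
  return dir.endsWith("/") ? dir + name : dir + "/" + name;
}
```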
General steps to implement video editing:
const nvStreamingContext = nvsGetStreamingContextInstance();
```javascript
// NvsLiveWindow initialization
const nvLiveWindow = nvStreamingContext.createLiveWindow("live-window");
// Set the fill mode
nvLiveWindow.setFillMode(NvsLiveWindowFillModeEnum.PreserveAspectFit);
```
```javascript
// Connect the timeline to the live window
nvStreamingContext.connectTimelineWithLiveWindow(timeline, nvLiveWindow);
```
```javascript
// html: <canvas id="live-window" />
const liveWindow = document.getElementById("live-window");
const width = 960;
const height = 540;
liveWindow.style.height = height + 'px';
liveWindow.style.width = width + 'px';
liveWindow.height = height * (window.devicePixelRatio || 1);
liveWindow.width = width * (window.devicePixelRatio || 1);

// Flags for timeline creation
const TIMELINE_FLAGS =
  NvsCreateTimelineFlagEnum.DontAddDefaultVideoTransition +
  NvsCreateTimelineFlagEnum.ForceAudioSampleFormatUsed +
  NvsCreateTimelineFlagEnum.RecordingUserOperation +
  NvsCreateTimelineFlagEnum.SyncAudioVideoTrasitionInVideoTrack;

// Create the timeline
nvTimeline = nvStreamingContext.createTimeline(
  new NvsVideoResolution(width, height),
  new NvsRational(25, 1),
  new NvsAudioResolution(44100, 2),
  TIMELINE_FLAGS,
);
```
createTimeline(videoRes: NvsVideoResolution, fps: NvsRational, audioRes: NvsAudioResolution, flags?: number): NvsTimeline;
Generally, you create a video track and then add picture or video material to it; material added to a track is called a clip. Pictures and videos are added to the track via file paths. Please note: if a picture is too large, reduce its size first; ideally the reduced picture should match the resolution used to create the timeline.
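The downscaling advice above can be sketched as a pure sizing helper; the actual pixel resampling (for example with an offscreen canvas) is left to the application:

```javascript
// Sketch: choose downscaled dimensions for an oversized picture so it does
// not exceed the timeline resolution, preserving the picture's aspect ratio.
function targetPictureSize(picW, picH, timelineW, timelineH) {
  const scale = Math.min(timelineW / picW, timelineH / picH, 1); // never upscale
  return { width: Math.round(picW * scale), height: Math.round(picH * scale) };
}
```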
const videoTrack = nvTimeline.appendVideoTrack();
const audioTrack = nvTimeline.appendAudioTrack();
```javascript
const response = await fetch('https://alieasset.meishesdk.com/editor/2022/07/05/video/afd62303-3492-4c31-b09c-1c56c63b46a2/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8');
const text = await response.text();
const path = `/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`;
FS.writeFile(path, text);
const clip = videoTrack.appendClip(path);
```
For the preview effect, refer to Video Positioning Preview.
playbackTimeline(timeline: NvsTimeline, startTime: number, endTime: number, videoSizeMode: number, preload: boolean, flags: number): boolean;
```javascript
nvStreamingContext.playbackTimeline(
  nvTimeline, // current timeline object
  nowTime,    // start time
  endTime,    // end time; -1 means play to the end
  NvsVideoPreviewSizeModeEnum.LiveWindowSize, // video size mode
  preload,    // whether to preload, set to true
  (flags |= NvsPlaybackFlagEnum.BuddyHostOriginVideoFrame) // playback flags
);
```
seekTimeline(timeline: NvsTimeline, timestamp: number, videoSizeMode: number, flags: number): boolean;
```javascript
nvStreamingContext.seekTimeline(
  nvTimeline, // current timeline object
  nowTime,    // target time
  NvsVideoPreviewSizeModeEnum.LiveWindowSize, // video preview size mode
  (flags |= NvsSeekFlagEnum.BuddyHostOriginVideoFrame) // seek flags
);
```
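Note that the timestamps in these calls are in microseconds (1000000 per second, as the values used throughout this document suggest); small conversion helpers, shown here as an illustrative convenience rather than an SDK API, keep call sites readable:

```javascript
// Timeline timestamps are expressed in microseconds.
const MICROS_PER_SECOND = 1000000;

function secondsToMicros(seconds) {
  return Math.round(seconds * MICROS_PER_SECOND);
}

function microsToSeconds(micros) {
  return micros / MICROS_PER_SECOND;
}
```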
```javascript
const clip = videoTrack.getClipByIndex(0);
clip.changeTrimInPoint(trim_in, true);
clip.changeTrimOutPoint(trim_out, true);
```
videoTrack.removeClip(0, false);
videoTrack.moveClip(0,3);
```javascript
const assetUrl = 'https://alieasset.meishesdk.com/test/2023/07/10/image/0c3c43a5-f9f8-4223-84f2-2c35c535f104/0c3c43a5-f9f8-4223-84f2-2c35c535f104.m3u8';
const response = await fetch(assetUrl);
const text = await response.text();
const path = `/0c3c43a5-f9f8-4223-84f2-2c35c535f104.m3u8`;
FS.writeFile(path, text);
videoTrack.appendClip2(path, 0, 8000000);
```
Create a timeline, add video tracks and audio tracks, and remove them if they are no longer needed. Here's how to do it:
nvStreamingContext.removeTimeline(nvTimeline);
nvTimeline.removeVideoTrack(0);
nvTimeline.removeAudioTrack(0);
For preview effect, please refer to Video Positioning Preview
```javascript
const assetUrl = 'https://alieasset.meishesdk.com/test/2024/05/24/audio/6dc60190-c22b-4740-a299-3981d1a8c7ec/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8';
const audioTrack = nvTimeline.appendAudioTrack();
const response = await fetch(assetUrl);
const text = await response.text();
const path = `/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8`;
FS.writeFile(path, text);
audioTrack.appendClip(path);
```
```javascript
const clip = audioTrack.getClipByIndex(0);
clip.changeTrimInPoint(trim_in, true);
clip.changeTrimOutPoint(trim_out, true);
```
For preview effect, please refer to Video Positioning Preview
Adding, deleting, and obtaining subtitles are all performed on the timeline via addCaption(). You can refer to the addCaption() method of the SdkDemo example.
const timelineCapion = nvTimeline.addCaption("Meishe SDK", 0, nvTimeline.getDuration(), captionStylePackageId,false);
captionStylePackageId is the subtitle style package ID. Before adding subtitles, you need to install the subtitle style package; refer to Material Package Management, and try the sample subtitle style package Good Species Grass.
```javascript
const fontUrl = 'https://alieasset.meishesdk.com/font/stationcoolblackbody.ttf';
const response = await Axios.get(fontUrl, {
  responseType: 'arraybuffer',
});
const fontInFS = '/' + fontUrl.split('/').pop();
await FS.writeFile(fontInFS, new Uint8Array(response.data));
// Set the font for the caption
caption.setFontByFilePath(fontInFS);
```
const nextCaption = nvTimeline.removeCaption(timelineCapion);
```javascript
// Get the first caption on the timeline
const firstCaption = nvTimeline.getFirstCaption();
// Get the last caption on the timeline
const lastCaption = nvTimeline.getLastCaption();
// Get the caption before the current caption on the timeline
const prevCaption = nvTimeline.getPrevCaption(currentCaption);
// Get the caption after the current caption on the timeline
const nextCaption = nvTimeline.getNextCaption(currentCaption);
```
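The first/next accessors above form a linked traversal. A generic helper can collect every item, and the same pattern fits the animated sticker and timeline effect accessors later in this document; the wiring shown in the comment is an assumption about how you would apply it:

```javascript
// Walk a first/next style traversal and collect all items into an array.
// Example wiring for captions (assumed usage):
//   const captions = collectAll(
//     () => nvTimeline.getFirstCaption(),
//     (c) => nvTimeline.getNextCaption(c)
//   );
function collectAll(getFirst, getNext) {
  const items = [];
  for (let item = getFirst(); item; item = getNext(item)) {
    items.push(item);
  }
  return items;
}
```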
Captions can also be obtained by their position on the timeline; the call returns a list of the captions at that position:
const captionList = nvTimeline.getCaptionsByTimelinePosition(1000000);
Modifying subtitle properties can be achieved through the methods of the NvsTimelineCaption class. After obtaining the subtitles, you can set the subtitle text, color, bold, italics, stroke, etc.
currentCaption.setText("Meishe SDK");
```javascript
// Change the in point
currentCaption.changeInPoint(1000000);
// Change the out point
currentCaption.changeOutPoint(5000000);
// Change the display position (the in and out points are both shifted by the offset)
currentCaption.movePosition(1000000);
```
For preview effect, please refer to Video Positioning Preview
Adding, deleting and obtaining animated stickers are also performed on the timeline. You can refer to the sticker module of the SdkDemo example.
nvTimeline.addAnimatedSticker(0, nvTimeline.getDuration(), stickerId);

stickerId is the animated sticker package ID. Before adding animated stickers, you need to install the animated sticker package; refer to Material Package Management, and try out the sample sticker Cream Cake.
const nextSticker = nvTimeline.removeAnimatedSticker(currentSticker);
```javascript
// Get the first animated sticker on the timeline
const firstSticker = nvTimeline.getFirstAnimatedSticker();
// Get the last animated sticker on the timeline
const lastSticker = nvTimeline.getLastAnimatedSticker();
// Get the animated sticker before the current one on the timeline
const prevSticker = nvTimeline.getPrevAnimatedSticker(currentSticker);
// Get the animated sticker after the current one on the timeline
const nextSticker = nvTimeline.getNextAnimatedSticker(currentSticker);
```
Animated stickers can also be obtained by position on the timeline; the call returns a list of the animated sticker objects at that position:
const stickerList = nvTimeline.getAnimatedStickersByTimelinePosition(1000000);
Modifying sticker properties can be achieved through the methods of the NvsTimelineAnimatedSticker class. After obtaining the sticker, you can set the zoom value, horizontal flip, rotation angle, translation, etc.
currentSticker.setScale(1.2);
currentSticker.setCenterPolarAngle(1.2);
```javascript
// Change the in point
currentSticker.changeInPoint(1000000);
// Change the out point
currentSticker.changeOutPoint(5000000);
// Change the display position (the in and out points are both shifted by the offset)
currentSticker.movePosition(1000000);
```
For preview effect, please refer to Video Positioning Preview
A timeline can itself be added to a video track as a clip (a nested timeline). You can refer to the addTimelineClip method of SdkDemo.
```javascript
const timeline = nvStreamingContext.createTimeline(
  new NvsVideoResolution(width, height),
  new NvsRational(25, 1),
  new NvsAudioResolution(44100, 2),
  TIMELINE_FLAGS,
);
```
const videoTrack = timeline.appendVideoTrack();
const clip = videoTrack.appendClip(filePath);
const timelineClip = videoTrack.addTimelineClip(timeline,0);
addTimelineClip(timeline: NvsTimeline, inPoint: number): NvsVideoClip;
For preview effect, please refer to Video Positioning Preview
Video transitions include built-in transitions and packaged transitions. To set a built-in video transition, refer to setBuiltinTransition:
videoTrack.setBuiltinTransition(0,transitionName);
videoTrack.setPackagedTransition(1, transitionId);

transitionId is the packaged transition ID. Before adding a packaged transition, you need to install the transition package; refer to Material Package Management and try out the sample transition 3D Rotation 03.
For preview effect, please refer to Video Positioning Preview
In video editing, three kinds of special effects are commonly used: video special effects (NvsVideoFx), audio special effects (NvsAudioFx), and timeline video special effects (NvsTimelineVideoFx).
Video special effects are applied to video clips, and several can be added to each clip. They include built-in video effects, packaged video effects, and beauty effects.
videoClip.appendBuiltinFx(fxName);
videoClip.appendPackagedFx(fxPackageId);

fxPackageId is the video effect package ID. Before adding a packaged video effect, you need to install the effect package; refer to Material Package Management and try out the sample effect vertical mirror.
Removing video special effects includes removing the effect at a specified index and removing all video effects.
videoClip.removeFx(0);
videoClip.removeAllFx();
Audio special effects are used on audio clips, and several audio special effects can be added to each audio clip.
audioClip.appendFx(fxName);
audioClip.removeFx(0);
Timeline video special effects are applied on the timeline, and include built-in effects and packaged effects. Several timeline video effects can be added to a timeline.
Add timeline effects:
nvTimeline.addBuiltinTimelineVideoFx(1000000,5000000,fxName);
nvTimeline.addPackagedTimelineVideoFx(1000000, 5000000, fxPackageId);

fxPackageId is the timeline video effect package ID. Before adding a packaged timeline effect, you need to install the effect package first; please refer to Material Package Management.
```javascript
// Get the first timeline video effect on the timeline
const firstTimelineFx = nvTimeline.getFirstTimelineVideoFx();
// Get the last timeline video effect on the timeline
const lastTimelineFx = nvTimeline.getLastTimelineVideoFx();
// Get the timeline video effect before a given one on the timeline
const prevTimelineFx = nvTimeline.getPrevTimelineVideoFx(currentTimelineFx);
// Get the timeline video effect after a given one on the timeline
const nextTimelineFx = nvTimeline.getNextTimelineVideoFx(currentTimelineFx);
```
Timeline video effects can also be obtained by position on the timeline; the call returns an array of the timeline video effect objects at that position, sorted as follows:
Effects with the same in point are arranged in the order in which they were added to the timeline.
const timelineFxArray = nvTimeline.getTimelineVideoFxByTimelinePosition(2000000);
```javascript
// Change the in point
currentTimelineFx.changeInPoint(1000000);
// Change the out point
currentTimelineFx.changeOutPoint(5000000);
// Change the display position (the in and out points are both shifted by the offset)
currentTimelineFx.movePosition(1000000);
```

For the preview effect, please refer to Video Positioning Preview.
Meishe SDK provides a rich material library, including animated stickers, themes, subtitle styles, transitions, etc. Material packages can be downloaded from the Internet or provided by the Meishe SDK project team. Users can choose to use these material packages according to their needs. Meishe SDK manages these material packages through the NvsAssetPackageManager class. You can install, uninstall, and obtain the status and version number of the material packages, etc. For details, refer to NvsAssetPackageManager
```javascript
import axios from 'axios';

const packageUrl = "https://qasset.meishesdk.com/material/pu/animatedsticker/A1509C3D-7F5C-43CB-96EE-639ED7616BB7/A1509C3D-7F5C-43CB-96EE-639ED7616BB7.1.animatedsticker";
const response = await axios.get(packageUrl, { responseType: 'arraybuffer' });
const packageFile = '/' + packageUrl.split('/').pop();
await FS.writeFile(packageFile, new Uint8Array(response.data));

let packageLicenseFile = '';
let packageType = NvsAssetPackageTypeEnum.AnimatedSticker;
const assetPackageManager = nvsGetStreamingContextInstance().getAssetPackageManager();
assetPackageManager.onFinishAssetPackageInstallation = (assetPackageId, assetPackageFilePath, assetPackageType, error) => {};
assetPackageManager.installAssetPackage(packageFile, packageLicenseFile, packageType);
```
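Because installation completes through the onFinishAssetPackageInstallation callback, it can be convenient to wrap the call in a Promise. This sketch assumes the callback's error argument is falsy on success (check the NvsAssetPackageManager reference), and note that it overwrites any previously assigned handler:

```javascript
// Install an asset package and resolve when the manager's completion
// callback fires. Overwrites any existing
// onFinishAssetPackageInstallation handler on the manager.
function installPackageAsync(manager, packageFile, licenseFile, packageType) {
  return new Promise((resolve, reject) => {
    manager.onFinishAssetPackageInstallation = (
      assetPackageId, assetPackageFilePath, assetPackageType, error
    ) => {
      if (error) {
        reject(new Error(`Package installation failed: ${error}`));
      } else {
        resolve(assetPackageId);
      }
    };
    manager.installAssetPackage(packageFile, licenseFile, packageType);
  });
}
```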
const error = nvStreamingContext.getAssetPackageManager().uninstallAssetPackage(stickerId,NvsAssetPackageTypeEnum.AnimatedSticker);
Meishe SDK provides many callback interfaces. If you want to query the capture device status, capture and recording status, video playback status, file generation status, resource package installation status, etc., you must set the callbacks and implement the corresponding callback methods after creating the NvsStreamingContext object. Refer to nvStreamingContext.
```javascript
nvStreamingContext.onPlaybackTimelinePosition = (timeline, position) => {
  console.log(`Current playback position ${position}`);
};
```
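Since the position is reported in microseconds, a small formatter (illustrative, not an SDK API) can turn it into a display string for a playhead UI:

```javascript
// Format a timeline position (microseconds) as mm:ss.t for display.
function formatTimelinePosition(micros) {
  const totalSeconds = micros / 1000000;
  const minutes = Math.floor(totalSeconds / 60);
  const seconds = Math.floor(totalSeconds % 60);
  const tenths = Math.floor((totalSeconds * 10) % 10);
  return `${String(minutes).padStart(2, "0")}:${String(seconds).padStart(2, "0")}.${tenths}`;
}
```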
For editing efficiency, the web side does not edit the original video but a low-resolution proxy. To synthesize the final video, you need to call a back-end interface. You can refer to the exportVideo() method in SdkDemo.
For more functions, please refer to https://www.meishesdk.com/cloudedit, where they are introduced in detail.