MeiCam SDK For Web  3.14.2
Detailed development guide


1 Overview

The Meishe SDK is designed to lower the technical barrier of web video development, so that developers with only web front-end experience can build video recording and editing features with excellent performance and rich rendering effects. Our advantages include:

  • No time limit on recording and editing
  • Industry-leading beautification effects
  • Real-time preview of edits and variable speed without transcoding
  • Mixed editing of pictures and videos
  • Generated video up to 1080p
  • Rich transitions, filters, and subtitle styles
  • Exclusive themes
  • Support for 4K production and generation
  • Open custom sticker function

1.1 Supported formats

  • Input specifications:
    • Video formats: MP4, MKV, TS, MTS, M2TS, AVI, FLV, MOV, M2P, MPG, RM, RMVB, WMV, VOB, 3GP, GIF, MXF, M4V, WEBM
    • Audio formats: MP3, WAV, AAC, AC3, M4A, MXF, FLAC, OPUS
    • Image formats: PNG, JPG, JPEG, JPE, BMP, TGA, PSD, WEBP
    • Video encoding: H264, WMV, MPEG4
    • Audio encoding: MP3, AAC, PCM, FLAC
  • Output specifications:
    • Video formats: MP4, MOV, WEBM, MXF, MPG, AVI
    • Video encoding: H264, MPEG2-IBP
    • Audio encoding: AAC, MP3
  • Expansion pack formats (an expansion pack is a content package used by Meishe to carry extended materials such as subtitles, filters, and stickers):
    • Subtitle style: .captionstyle
    • Compound subtitle: .compoundCaption
    • Caption renderer: .captionrenderer
    • Bubble: .captioncontext
    • Caption in animation: .captioninanimation
    • Caption out animation: .captionoutanimation
    • Caption combined animation: .captionanimation
    • Filter: .videofx
    • Sticker: .animatedsticker
    • Sticker in animation: .animatedstickerinanimation
    • Sticker out animation: .animatedstickeroutanimation
    • Sticker combined animation: .animatedstickeranimation
    • Transition: .videotransition
    • Template: .template
    • Project: .project

1.2 Notes

  1. Operating environment: almost all mainstream browsers, such as Chrome, Firefox, Edge, and Safari, are supported.
  2. The Meishe SDK uses SharedArrayBuffer. Both the development environment and the production environment of the website must add the following two response headers to achieve cross-origin isolation:
      Cross-Origin-Opener-Policy: same-origin
      Cross-Origin-Embedder-Policy: require-corp
    
    Take the development environment as an example
      // vite.config.js
      export default defineConfig({
        plugins: [
          ...,
          {
            name: "configure-response-headers",
            configureServer: (server) => {
              server.middlewares.use((_req, res, next) => {
                res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
                res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
                next();
              });
            },
          },
        ],
        ...
      })
    
  3. In production, the site must also be served in a secure context (e.g., over HTTPS) for SharedArrayBuffer to be available.
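For production servers that are not built with Vite, the same two headers can be added with a framework-agnostic middleware. This is a sketch, assuming a Connect/Express-style `(req, res, next)` server; the function name is ours:

```javascript
// Add the two headers required for cross-origin isolation, which
// SharedArrayBuffer depends on. Works with any Connect/Express-style server.
function crossOriginIsolationMiddleware(_req, res, next) {
  res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
  res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
  next();
}
```

Register it before your static-file handler, e.g. `app.use(crossOriginIsolationMiddleware)` in Express.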

2 Quick access

2.1 Import library files

If you do not yet have the Meishe SDK library, please contact Meishe business first to obtain the SDK and authorization:

  1. Include the Meishe SDK interface file NvEffectSdk.js
      <script src="https://alieasset.meishesdk.com/NvWasm/domain/13/NvEffectSdk.js"></script>
    
  2. Use the loading tool provided by Meishe to load the wasm-related files
  • (a) Install the meishewasmloader package
      npm i meishewasmloader --save
    
  • (b) Use WASMLoader to load wasm related files
      import { WASMLoader } from 'meishewasmloader';
      const wasmLoader = WASMLoader({
        // Loading progress callback
        showLoader: function (state) {},
        // Failure callback
        showError(errorText) {},
        // Success callback
        loadingFinished() {},
      });
      wasmLoader.loadEmscriptenModule("https://alieasset.meishesdk.com/NvWasm/domain/13/"); // Start loading
    
Note
The address https://alieasset.meishesdk.com/NvWasm/domain/13/ is a public test address provided by Meishe. Before releasing a product online, contact Meishe business to obtain the SDK files, deploy them privately in your own environment, and replace this address;
When deploying the wasm-related files privately, configure brotli or gzip compression for the wasm file to improve loading speed;
The addresses in steps 1 and 2 must be consistent, otherwise unpredictable results will occur;

2.2 Authorization verification

  • The default watermark will only be removed after passing authorization verification
      // Callback function for authorization verification
      nvStreamingContext.onWebRequestAuthFinish = (success) => {
        if (!success) {
          console.error('SDK authentication failed');
        } else {
          console.log('SDK authentication successful');
        }
      };
      nvStreamingContext.verifySdkLicenseFile('Authentication Address')
    

3 Instructions for use

3.1 Core concepts

3.1.1 NvsStreamingContext class

NvsStreamingContext is the streaming context class of Meishe SDK and can be regarded as the entrance to the entire SDK framework.

3.1.2 Preview using NvsLiveWindow class

  • Creating an NvsLiveWindow requires an HTML canvas tag:
      <canvas id="live-window"></canvas>
    
Note
Video rendered on the canvas tag may appear blurry. You need to set the canvas pixel size in JS as follows:
  var canvas = document.getElementById("live-window");
  canvas.width = canvas.clientWidth * window.devicePixelRatio;
  canvas.height = canvas.clientHeight * window.devicePixelRatio;

The aspect ratio of the NvsLiveWindow should be 1:1, 4:3, 16:9, 9:16, etc., and preferably match the aspect ratio of the footage being edited; otherwise the previewed image will be cropped.
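The device-pixel-ratio sizing shown in the note above can be wrapped in a small helper. A sketch; the function name is ours, not part of the SDK:

```javascript
// Compute CSS size and backing-store (pixel) size for a preview canvas.
// cssWidth/cssHeight: desired layout size in CSS pixels.
// dpr: window.devicePixelRatio (passed in so the helper stays testable).
function canvasSizes(cssWidth, cssHeight, dpr) {
  const ratio = dpr || 1;
  return {
    styleWidth: cssWidth + 'px',   // canvas.style.width
    styleHeight: cssHeight + 'px', // canvas.style.height
    width: Math.round(cssWidth * ratio),   // canvas.width attribute
    height: Math.round(cssHeight * ratio), // canvas.height attribute
  };
}
```

Apply the returned values to the element exactly as in the note above (style sizes to `canvas.style`, pixel sizes to the `width`/`height` attributes).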

  • Fill mode of NvsLiveWindow: see NvsLiveWindowFillModeEnum
      declare const NvsLiveWindowFillModeEnum: Readonly<{
        // The image fills the window proportionally, cropped if necessary (default mode)
        PreserveAspectCrop: 0;
        // The image is scaled proportionally to fit inside the window, without cropping
        PreserveAspectFit: 1;
        // The image is stretched to fill the window
        Stretch: 2;
      }>;
    

The three fill modes are illustrated below:

  • PreserveAspectCrop mode:
  • PreserveAspectFit mode:
  • Stretch mode:

3.1.3 FS class

The FS class is the SDK's resource management class, mainly responsible for loading, creating, and deleting resources through methods such as FS.mkdir(), FS.readFile(), and FS.writeFile(). For more methods, please see the FS reference.

3.1.4 Use of resources

  • Meishe supports uploading videos to the cloud and provides corresponding material-management interfaces. Uploaded materials are transcoded into m3u8 format to facilitate network transmission.
  • Files are written into memory through FS objects, taking videos or pictures as an example:
      const response = await fetch('https://alieasset.meishesdk.com/editor/2022/07/05/video/afd62303-3492-4c31-b09c-1c56c63b46a2/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8');
      const text = await response.text();
      const path = `/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`
      FS.writeFile(path, text)
    
Note
path is the path of the material; subsequent SDK operations are based on this path. text is the content of the material file (an m3u8 playlist is plain text; binary materials such as fonts are written as a Uint8Array instead).
  • To make it easier to manage different materials, you can also create directories, again taking videos or pictures as an example:
      FS.mkdir('/m3u8')
      const path = `/m3u8/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`
      FS.writeFile(path, text)
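The path-building convention used above (filename taken from the last URL segment, written under a directory in the in-memory file system) can be captured in a small helper. A sketch; the function name is ours and not part of the SDK:

```javascript
// Derive the in-memory FS path for a material from its download URL.
// dir: optional directory (e.g. '/m3u8'); defaults to the root '/'.
function assetPathFromUrl(url, dir) {
  const name = new URL(url).pathname.split('/').pop(); // last path segment
  const base = dir ? dir.replace(/\/$/, '') : '';
  return base + '/' + name;
}
```

Usage, following the snippets above: `FS.mkdir('/m3u8'); FS.writeFile(assetPathFromUrl(assetUrl, '/m3u8'), text);`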
    

3.2 Edit

General steps to implement video editing:

  1. First initialize NvsStreamingContext.
      const nvStreamingContext = nvsGetStreamingContextInstance();
    
  2. Create a timeline. Timelines are created through the NvsStreamingContext object. A timeline takes up few resources, so multiple timelines can be created in one program if necessary; generally, one is enough.
  3. Create a streaming window (liveWindow) through the NvsStreamingContext object. The streaming window is used to display the video image.
      //NvsLiveWindow initialization
      const nvLiveWindow = nvStreamingContext.createLiveWindow("live-window");
      //Set fill mode
      nvLiveWindow.setFillMode(NvsLiveWindowFillModeEnum.PreserveAspectFit);
    
  4. Connect the timeline to the streaming window
      //Connect the timeline to the streaming window
      nvStreamingContext.connectTimelineWithLiveWindow(timeline, nvLiveWindow);
    
  5. Add tracks, including video tracks (VideoTrack) and audio tracks (AudioTrack). Video and audio tracks can be added to the timeline, with video clips added to video tracks and audio clips added to audio tracks. To implement picture-in-picture, add two video tracks. Audio tracks are generally used to add music or dubbing to a video.
  6. Add clips to the tracks. Multiple clips, either video files or pictures, can be added to a video track for mixed editing of pictures and videos. Multiple music files can also be added to an audio track.
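Putting the six steps together, the call sequence looks like the sketch below. To keep it runnable outside the browser, the SDK objects are replaced by minimal stand-ins with the same method names; in a real page you would use the objects returned by nvsGetStreamingContextInstance():

```javascript
// Minimal stand-ins for the SDK objects (assumption: shapes mirror the guide).
function makeTrack() {
  return { clips: [], appendClip(path) { this.clips.push(path); return { path }; } };
}
function makeTimeline() {
  return {
    videoTracks: [], audioTracks: [],
    appendVideoTrack() { const t = makeTrack(); this.videoTracks.push(t); return t; },
    appendAudioTrack() { const t = makeTrack(); this.audioTracks.push(t); return t; },
  };
}
const nvStreamingContext = {
  createTimeline() { return makeTimeline(); },
  createLiveWindow(id) { return { id }; },
  connectTimelineWithLiveWindow(timeline, liveWindow) { this.connected = [timeline, liveWindow]; },
};

// The six steps from the list above:
const timeline = nvStreamingContext.createTimeline();                     // steps 1-2
const nvLiveWindow = nvStreamingContext.createLiveWindow('live-window');  // step 3
nvStreamingContext.connectTimelineWithLiveWindow(timeline, nvLiveWindow); // step 4
const videoTrack = timeline.appendVideoTrack();                           // step 5
const audioTrack = timeline.appendAudioTrack();                           // step 5
videoTrack.appendClip('/clip.m3u8');                                      // step 6
audioTrack.appendClip('/music.m3u8');                                     // step 6
```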

3.2.1 Create timeline and set video size

  • Creating the timeline is critical for editing. The timeline resolution determines the maximum resolution (size) of the generated video file. Match the timeline resolution to the aspect ratio of the NvsLiveWindow.
      // html
      <canvas id="live-window"></canvas>
      // js
      const liveWindow = document.getElementById("live-window");
      const width = 960;
      const height = 540;
      liveWindow.style.height = height + 'px';
      liveWindow.style.width = width + 'px';
      liveWindow.height = height * (window.devicePixelRatio || 1);
      liveWindow.width = width * (window.devicePixelRatio || 1);

      // Flags for timeline creation
      const TIMELINE_FLAGS =
        NvsCreateTimelineFlagEnum.DontAddDefaultVideoTransition +
        NvsCreateTimelineFlagEnum.ForceAudioSampleFormatUsed +
        NvsCreateTimelineFlagEnum.RecordingUserOperation +
        NvsCreateTimelineFlagEnum.SyncAudioVideoTrasitionInVideoTrack;
      /* Create timeline */
      nvTimeline = nvStreamingContext.createTimeline(
        new NvsVideoResolution(width, height),
        new NvsRational(25, 1),
        new NvsAudioResolution(44100, 2),
        TIMELINE_FLAGS,
      );
    
  • Method reference: the createTimeline() method.
       createTimeline(videoRes: NvsVideoResolution, fps: NvsRational, audioRes: NvsAudioResolution, flags?: number): NvsTimeline;
    
  • Parameters
    • videoRes: video resolution (NvsVideoResolution)
    • fps: frame rate (NvsRational)
    • audioRes: audio resolution (NvsAudioResolution)
    • flags: optional timeline creation flags

3.2.2 Mixing of multiple videos and pictures

Generally, you create a video track and then add picture or video material to it. Material added to a track is called a clip. Pictures and videos are added to the track via file paths. Note: if a picture is too large, scale it down first; ideally the reduced picture matches the resolution used to create the timeline.

  • Add video track:
      const videoTrack = nvTimeline.appendVideoTrack();
    
  • Add audio track:
      const audioTrack = nvTimeline.appendAudioTrack();
    
  • Add a clip: refer to Use of resources
      const response = await fetch('https://alieasset.meishesdk.com/editor/2022/07/05/video/afd62303-3492-4c31-b09c-1c56c63b46a2/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8');
      const text = await response.text();
      const path = `/afd62303-3492-4c31-b09c-1c56c63b46a2.m3u8`
      FS.writeFile(path, text)
      const clip = videoTrack.appendClip(path);
    

For preview effect, please refer to Video Positioning Preview

3.2.3 Video playback and positioning preview

3.2.3.1 Video playback

  • Method playbackTimeline
      playbackTimeline(timeline: NvsTimeline, startTime: number, endTime: number, videoSizeMode: number, preload: boolean, flags: number): boolean;

  • Usage examples
      nvStreamingContext.playbackTimeline(
        nvTimeline, // current timeline object
        nowTime, // start time
        endTime, // end time; -1 means play to the end
        NvsVideoPreviewSizeModeEnum.LiveWindowSize, // video size mode
        preload, // whether to preload; set to true
        flags | NvsPlaybackFlagEnum.BuddyHostOriginVideoFrame // playback flags
      );
    
  • Parameters
    • timeline timeline object
    • startTime: start time in microseconds (1/1,000,000 second).
    • endTime: end time in microseconds. Can be timeline.duration or -1.
    • videoSizeMode: see NvsVideoPreviewSizeModeEnum. NvsVideoPreviewSizeModeEnum.LiveWindowSize is recommended; unless there is a special requirement, NvsVideoPreviewSizeModeEnum.FullSize will affect performance.
    • preload: whether to preload; true to preload, false otherwise.
    • flags: playback flags, see NvsPlaybackFlagEnum
Note
The time unit of Meishe SDK is microsecond, 1/1000000 second.
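Since every SDK time value is in microseconds, it helps to convert at the edges of your code. A small sketch (the helper names are ours, not part of the SDK):

```javascript
const US_PER_SECOND = 1000000; // Meishe SDK time unit: microseconds

// Convert seconds to SDK microseconds (integer).
function secondsToUs(seconds) {
  return Math.round(seconds * US_PER_SECOND);
}

// Convert SDK microseconds back to seconds.
function usToSeconds(us) {
  return us / US_PER_SECOND;
}
```

For example, a seek to 1.5 s would pass `secondsToUs(1.5)` as the timestamp argument.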

3.2.3.2 Video positioning preview

  • Method seekTimeline
      seekTimeline(timeline: NvsTimeline, timestamp: number, videoSizeMode: number, flags: number): boolean;
    
  • Usage examples
      nvStreamingContext.seekTimeline(
        nvTimeline, // current timeline object
        nowTime, // target time
        NvsVideoPreviewSizeModeEnum.LiveWindowSize, // video preview size mode
        flags | NvsSeekFlagEnum.BuddyHostOriginVideoFrame // seek flags
      );
    
  • Parameters
    • timeline timeline object
    • timestamp: target time in microseconds (1/1,000,000 second).
    • videoSizeMode: see NvsVideoPreviewSizeModeEnum. NvsVideoPreviewSizeModeEnum.LiveWindowSize is recommended; unless there is a special requirement, NvsVideoPreviewSizeModeEnum.FullSize will affect performance.
    • flags: seek flags, see NvsSeekFlagEnum
Note
To preview the effect of any modification to the timeline (adding, removing, sorting, cropping, etc.), this method needs to be called.

3.2.4 Video cropping

  • Change the in and out points of a clip to crop it.
      const clip = videoTrack.getClipByIndex(0);
      clip.changeTrimInPoint(trim_in, true);
      clip.changeTrimOutPoint(trim_out, true);
    

3.2.5 Remove clips from a track

  • Remove a clip:
      videoTrack.removeClip(0, false);
    

3.2.6 Clip sorting

  • Clips on a track can swap positions. The moveClip() parameters clipIndex and destClipIndex are the position indexes of the two clips being swapped.
      videoTrack.moveClip(0, 3);
    

3.2.7 Picture duration setting

  • The NvsVideoTrack class provides appendClip2(filePath, trimIn, trimOut) to freely set the picture duration as needed.
  • filePath is the path of the picture material. If trimIn is set to 0 and trimOut is set to 8000000, the picture will be displayed for 8 seconds.
      const assetUrl = 'https://alieasset.meishesdk.com/test/2023/07/10/image/0c3c43a5-f9f8-4223-84f2-2c35c535f104/0c3c43a5-f9f8-4223-84f2-2c35c535f104.m3u8'
      const response = await fetch(assetUrl);
      const text = await response.text();
      const path = `/0c3c43a5-f9f8-4223-84f2-2c35c535f104.m3u8`
      FS.writeFile(path, text)
      videoTrack.appendClip2(path, 0, 8000000);
    
  • If you add a picture through appendClip(String filePath), the default display time of the picture is 4 seconds.

3.2.8 Remove timeline and track

Create a timeline, add video tracks and audio tracks, and remove them if they are no longer needed. Here's how to do it:

  • Remove timeline:
      nvStreamingContext.removeTimeline(nvTimeline);
    
  • Remove video track:
      nvTimeline.removeVideoTrack(0);
    
  • Remove audio track:
      nvTimeline.removeAudioTrack(0);
    

For preview effect, please refer to Video Positioning Preview

3.3 Music

3.3.1 Add music

  • Adding music to a video is done by adding audio clips to an audio track. After creating the timeline, add an audio track through appendAudioTrack(), then add music files to it as audio clips. Multiple pieces of music can be added, and they will play back to back.
      const assetUrl = 'https://alieasset.meishesdk.com/test/2024/05/24/audio/6dc60190-c22b-4740-a299-3981d1a8c7ec/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8'
      const audioTrack = nvTimeline.appendAudioTrack();
      const response = await fetch(assetUrl);
      const text = await response.text();
      const path = `/6dc60190-c22b-4740-a299-3981d1a8c7ec.m3u8`
      FS.writeFile(path, text)
      audioTrack.appendClip(path);
    

3.3.2 Music cropping

  • Music cropping works the same way as video cropping: set the trim-in and trim-out points.
      const clip = audioTrack.getClipByIndex(0);
      clip.changeTrimInPoint(trim_in, true);
      clip.changeTrimOutPoint(trim_out, true);
    

For preview effect, please refer to Video Positioning Preview

3.4 Subtitles

Adding, deleting, and obtaining subtitles are all performed on the timeline by calling addCaption(). You can refer to the addCaption() method in the SdkDemo example.

3.4.1 Adding and deleting subtitles

  • Add a subtitle and set how long it is displayed.
      const timelineCaption = nvTimeline.addCaption("Meishe SDK", 0, nvTimeline.getDuration(), captionStylePackageId, false);
    

captionStylePackageId is the ID of the subtitle style package. Before adding subtitles, you need to install the subtitle style package; refer to Material Package Management, and try the sample subtitle style package "Good Species Grass"

  • When the subtitle text is Chinese, the preview will be garbled unless fonts are installed. Font installation:
      const fontUrl = 'https://alieasset.meishesdk.com/font/stationcoolblackbody.ttf'
      const response = await Axios.get(fontUrl, {
        responseType: 'arraybuffer',
      })
      const fontInFS = '/' + fontUrl.split('/').pop()
      await FS.writeFile(fontInFS, new Uint8Array(response.data))
      // Set the font for the subtitle
      caption.setFontByFilePath(fontInFS)
    
  • Remove a subtitle; returns the next subtitle object on the timeline, or null if there is none.
      const nextCaption = nvTimeline.removeCaption(timelineCaption);
    

3.4.2 Get subtitles on timeline

  • There are many ways to get subtitles:
      //Get the first subtitle on the timeline
      const firstCaption = nvTimeline.getFirstCaption();
    
      //Get the last subtitle on the timeline
      const lastCaption = nvTimeline.getLastCaption();
    
      //Get the previous subtitle of the current subtitle on the timeline
      const prevCaption = nvTimeline.getPrevCaption(currentCaption);
    
      //Get the next subtitle of the current subtitle on the timeline
      const nextCaption = nvTimeline.getNextCaption(currentCaption);
    

Subtitles can also be obtained by position on the timeline; a List of the subtitles at that position is returned. The returned list is sorted as follows:

  • If the in points differ, subtitles are ordered by in point;
  • If the in points are the same, subtitles are ordered by the order in which they were added.
      const captionList = nvTimeline.getCaptionsByTimelinePosition(1000000);
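The ordering rule above (by in point, with insertion order breaking ties) is exactly what a stable sort by in point produces. A sketch with plain objects; the field names are ours, not the SDK's:

```javascript
// Sort items by in point. A stable sort keeps insertion order for equal
// in points; Array.prototype.sort is stable in modern engines (ES2019+).
// Each item is { inPoint: microseconds, ... }.
function sortByInPoint(items) {
  return items.slice().sort((a, b) => a.inPoint - b.inPoint);
}
```

The same rule applies to the animated sticker and timeline effect lists described later.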
    

3.4.3 Change subtitle attributes

Modifying subtitle properties can be achieved through the methods of the NvsTimelineCaption class. After obtaining the subtitles, you can set the subtitle text, color, bold, italics, stroke, etc.

  • Take modifying subtitle text as an example:
      currentCaption.setText("Meishe SDK");
    

3.4.4 Change the entry and exit points of subtitles

  • After obtaining the subtitles, you can modify the in point, out point and offset value of the subtitles on the timeline.
      //Change entry point
      currentCaption.changeInPoint(1000000);
    
      //Change the out point
      currentCaption.changeOutPoint(5000000);
    
      //Change the display position (the in point and the out point are offset by the offset value at the same time)
      currentCaption.movePosition(1000000);
    

For preview effect, please refer to Video Positioning Preview

3.5 Animated stickers

Adding, deleting and obtaining animated stickers are also performed on the timeline. You can refer to the sticker module of the SdkDemo example.

3.5.1 Adding and deleting animated stickers

  • Add animated stickers:
      nvTimeline.addAnimatedSticker(0, nvTimeline.getDuration(), stickerId);
    
    stickerId is the animated sticker package ID. Before adding animated stickers, you need to install the animated sticker package; refer to Material Package Management, and try out the sample sticker "Cream Cake"
  • Remove an animated sticker; returns the sticker after the current one, or null if there is none.
      const nextSticker = nvTimeline.removeAnimatedSticker(currentSticker);
    

3.5.2 Get animated stickers on timeline

  • There are multiple ways to get animated stickers added to your timeline.
      //Get the first animated sticker on the timeline
      const firstSticker = nvTimeline.getFirstAnimatedSticker();
    
      //Get the last animated sticker on the timeline
      const lastSticker = nvTimeline.getLastAnimatedSticker();
    
      //Get the previous animated sticker of the current animated sticker in the timeline
      const prevSticker = nvTimeline.getPrevAnimatedSticker(currentSticker);
    
      //Get the next animated sticker of the current animated sticker in the timeline
      const nextSticker = nvTimeline.getNextAnimatedSticker(currentSticker);
    

Animated stickers can also be obtained by position on the timeline; a List of the animated sticker objects at that position is returned. The returned list is sorted as follows:

  • If the in points differ, stickers are ordered by in point;
  • If the in points are the same, stickers are ordered by the order in which they were added.
      const stickerList = nvTimeline.getAnimatedStickersByTimelinePosition(1000000);
    

3.5.3 Change the properties of animated stickers

Modifying sticker properties can be achieved through the methods of the NvsTimelineAnimatedSticker class. After obtaining the sticker, you can set the zoom value, horizontal flip, rotation angle, translation, etc.

  • Take modifying sticker scaling as an example:
      currentSticker.setScale(1.2);
    
  • If it is a panoramic animated sticker, you can also set the polar angle of the center point of the sticker, the azimuth angle of the center point, etc. Take setting the polar angle of the center point as an example:
      currentSticker.setCenterPolarAngle(1.2);
    

3.5.4 Change the entry and exit points of animated stickers

  • After obtaining the sticker, you can modify the in point, out point and offset value of the animated sticker on the timeline.
      //Change entry point
      currentSticker.changeInPoint(1000000);
    
      //Change the out point
      currentSticker.changeOutPoint(5000000);
    
      //Change the display position (the in point and the out point are offset by the offset value at the same time)
      currentSticker.movePosition(1000000);
    

For preview effect, please refer to Video Positioning Preview

3.6 Clip timeline

A clip timeline is a timeline added onto a video track as a clip. You can refer to the addTimelineClip method in SdkDemo.

  1. First, create a timeline
      const timeline = nvStreamingContext.createTimeline(
        new NvsVideoResolution(width, height),
        new NvsRational(25, 1),
        new NvsAudioResolution(44100, 2),
        TIMELINE_FLAGS,
      );
    
  2. Then create the video track
      const videoTrack = timeline.appendVideoTrack();
    
  3. Then add a clip to it
      const clip = videoTrack.appendClip(filePath);
    
  4. Finally, add this timeline to a video track of the main timeline (note: the videoTrack used here must be a track on the main timeline, not the track created in step 2)
      const timelineClip = videoTrack.addTimelineClip(timeline, 0);
    
  • The return value of addTimelineClip is an NvsVideoClip object; refer to NvsVideoClip. Use isTimelineClip() to determine whether a clip is a timeline clip, and getInternalTimeline() to obtain the internal timeline.
      addTimelineClip(timeline: NvsTimeline, inPoint: number): NvsVideoClip;
    

For preview effect, please refer to Video Positioning Preview

3.7 Video transition

Video transitions include built-in transitions and packaged transitions. To set a built-in video transition, refer to setBuiltinTransition

  • Supported built-in transitions: refer to Built-in Video Transitions
      videoTrack.setBuiltinTransition(0, transitionName);
    
  • Video transition package: refer to setPackagedTransition
      videoTrack.setPackagedTransition(1, transitionId);
    
    transitionId is the packaged transition ID. Before adding a packaged transition, you need to install it; refer to Material Package Management and try out the sample transition "3D Rotation 03"

For preview effect, please refer to Video Positioning Preview

3.8 Special effects

In video editing, several kinds of special effects are commonly used: video effects (NvsVideoFx), audio effects (NvsAudioFx), and timeline video effects (NvsTimelineVideoFx).

3.8.1 Video special effects

Video effects are applied to video clips, and several can be added to each clip. They include built-in video effects, packaged video effects, and beautification effects.

  • Add a built-in video effect (supported effects: refer to Built-in video effects):
      videoClip.appendBuiltinFx(fxName);
    
  • Added video effects package:
      videoClip.appendPackagedFx(fxPackageId);
    
    fxPackageId is the video effect package ID. Before adding a packaged video effect, you need to install the package; refer to Material Package Management and try out the sample effect "Vertical Mirror"

You can remove the video effect at a specified index, or remove all video effects.

  • Remove the effect at the specified index:
      videoClip.removeFx(0);
    
  • Remove all video effects:
      videoClip.removeAllFx();
    

3.8.2 Audio special effects

Audio special effects are used on audio clips, and several audio special effects can be added to each audio clip.

  • Add an audio effect (supported built-in audio effects: refer to Built-in Audio Effects):
      audioClip.appendFx(fxName);
    
  • Remove the audio effects at the specified index:
      audioClip.removeFx(0);
    

3.8.3 Timeline video special effects

Timeline video effects are effects applied on the timeline; they include built-in effects and packaged effects. Several timeline video effects can be added to a timeline.

3.8.3.1 Adding and deleting timeline effects

Add timeline effects:

  • Add a built-in timeline effect (supported effects: refer to Built-in video effects):
      nvTimeline.addBuiltinTimelineVideoFx(1000000,5000000,fxName);
    
  • Added video effects package:
      nvTimeline.addPackagedTimelineVideoFx(1000000,5000000,fxPackageId);
    
    fxPackageId is the effect package ID. Before adding a packaged timeline effect, you need to install it first. Please refer to Material Package Management

3.8.3.2 Get timeline effects

  • There are many ways to obtain timeline effects.
      //Get the first timeline video effects on the timeline
      const firstTimelineFx = nvTimeline.getFirstTimelineVideoFx();
      //Get the last timeline video effects on the timeline
      const lastTimelineFx = nvTimeline.getLastTimelineVideoFx();
      //Get the previous timeline video effects of a certain timeline video effect on the timeline
      const prevTimelineFx = nvTimeline.getPrevTimelineVideoFx(currentTimelineFx);
      //Get the next timeline video effects of a certain timeline video effect on the timeline
      const nextTimelineFx = nvTimeline.getNextTimelineVideoFx(currentTimelineFx);
    

Timeline video effects can also be obtained by position on the timeline; an array of the timeline video effect objects at that position is returned. The returned array is sorted as follows:

  • If the in points differ, effects are ordered by in point;
  • If the in points are the same, effects are ordered by the order in which they were added.

      const timelineFxArray = nvTimeline.getTimelineVideoFxByTimelinePosition(2000000);

3.8.3.3 Change the entry and exit points of timeline special effects

  • After obtaining the timeline effects, you can modify the in point, out point and offset value of the timeline effects on the timeline.
      //Change entry point
      currentTimelineFx.changeInPoint(1000000);
      //Change the out point
      currentTimelineFx.changeOutPoint(5000000);
      //Change the display position (the in point and the out point are offset by the offset value at the same time)
      currentTimelineFx.movePosition(1000000);
    
    For preview effect, please refer to Video Positioning Preview

3.9 Material package management

The Meishe SDK provides a rich material library, including animated stickers, themes, subtitle styles, transitions, etc. Material packages can be downloaded from the Internet or provided by the Meishe SDK project team, and you can choose packages according to your needs. The SDK manages these packages through the NvsAssetPackageManager class: you can install, uninstall, and query the status and version number of material packages. For details, refer to NvsAssetPackageManager

  • Use the installAssetPackage method to install the material package, taking animated stickers as an example:
      import axios from 'axios';
    
      const packageUrl = "https://qasset.meishesdk.com/material/pu/animatedsticker/A1509C3D-7F5C-43CB-96EE-639ED7616BB7/A1509C3D-7F5C-43CB-96EE-639ED7616BB7.1.animatedsticker";
      const response = await axios.get(packageUrl, { responseType: 'arraybuffer' });
      const packageFile = '/' + packageUrl.split('/').pop();
      await FS.writeFile(packageFile, new Uint8Array(response.data));
      let packageLicenseFile = '';
      let packageType = NveAssetPackageTypeEnum.AnimatedSticker;
      const assetPackageManager = nvsGetStreamingContextInstance().getAssetPackageManager();
      assetPackageManager.onFinishAssetPackageInstallation = (assetPackageId, assetPackageFilePath, assetPackageType, error) => {};
      assetPackageManager.installAssetPackage(packageFile, packageLicenseFile, packageType);
    
Note
onFinishAssetPackageInstallation() is the callback invoked when the package installation finishes;
  • packageType is an NveAssetPackageTypeEnum value; different package resources correspond to different types;
  • packageFile must keep the same name as the original resource package; renaming it will cause the installation to fail;
  • packageLicenseFile can be an empty string when the SDK has not verified an authorization file. If the SDK has verified an authorization file, you need the license file corresponding to each package; that license file is unique to the package and is not the SDK's authorization file.
  • Uninstall a material package with the uninstallAssetPackage method:
      const error = nvStreamingContext.getAssetPackageManager().uninstallAssetPackage(stickerId, NvsAssetPackageTypeEnum.AnimatedSticker);
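Because installation completes through the onFinishAssetPackageInstallation callback, it can be convenient to wrap the install in a Promise. A sketch; the wrapper is ours, and the manager is only assumed to have the callback shape shown above:

```javascript
// Wrap callback-style package installation in a Promise.
// manager must expose installAssetPackage(...) and report completion via the
// onFinishAssetPackageInstallation callback, as described above.
function installPackage(manager, packageFile, licenseFile, packageType) {
  return new Promise((resolve, reject) => {
    manager.onFinishAssetPackageInstallation = (id, filePath, type, error) => {
      if (error) reject(new Error('install failed: ' + error));
      else resolve(id); // resolve with the installed package ID
    };
    manager.installAssetPackage(packageFile, licenseFile, packageType);
  });
}
```

Usage (hedged): `const packageId = await installPackage(assetPackageManager, packageFile, '', packageType);`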
    

3.10 Callbacks

The Meishe SDK provides many callback interfaces. To query the status of the capture device, the capture/recording status, video playback status, file generation status, resource package installation status, and so on, you must set the callback and implement the corresponding callback method after creating the NvsStreamingContext object. Reference: nvStreamingContext

  • Take the video playback callback onPlaybackTimelinePosition as an example:
      nvStreamingContext.onPlaybackTimelinePosition = (timeline, position) => {
        console.log(`Current playback position ${position}`);
      }
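The wiring pattern above can be exercised without the SDK. In this sketch, nvStreamingContext is a minimal stand-in that emits positions, standing in for the real object from nvsGetStreamingContextInstance(); the emit method is ours:

```javascript
// Stand-in for the streaming context: fires the callback like the SDK does
// during playback (assumption: same callback signature as documented above).
const nvStreamingContext = {
  emitPosition(timeline, position) {
    if (this.onPlaybackTimelinePosition) this.onPlaybackTimelinePosition(timeline, position);
  },
};

const positions = [];
nvStreamingContext.onPlaybackTimelinePosition = (timeline, position) => {
  positions.push(position); // e.g. drive a progress bar here
};

// Simulate playback progress (positions in microseconds).
nvStreamingContext.emitPosition(null, 0);
nvStreamingContext.emitPosition(null, 1000000);
```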
    

3.11 Generate video

For editing efficiency, the web side does not edit the original video but a low-resolution proxy. To synthesize the final video, call the back-end interface to generate it. You can refer to the exportVideo() method in sdkDemo.

4 Attachments

For more functions, please refer to https://www.meishesdk.com/cloudedit, where they are introduced in detail.