
Meishe AR Face Filter - ZEGO


This article explains how to integrate and use the Meishe AR Face Filter plugin in your Web projects.

Prerequisites

Before you begin, ensure that the following requirements are met:

  • For Windows or macOS computers, the following requirements must be met:
    • Desktop (PC) browsers only: iPad and other tablet devices are not supported.
    • Chrome, Opera, and 360 browser (Chromium kernel): version 75 or later.
    • Firefox: kernel version 58 or later (special case: version 72 is not supported).
    • Safari: version 15.4 at minimum; 17.0 or later performs best, and the latest version is recommended.
    • A physical audio and video capture device.
    • A working Internet connection.
  • Node.js and npm installed
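Because browser support varies, it can help to feature-detect before loading any SDK code. The sketch below is illustrative only (the helper name and shape are assumptions, not part of either SDK); it checks that the media-capture API the plugin relies on is present.

```typescript
// Hypothetical pre-flight check; `hasCaptureSupport` is an illustrative name,
// not an SDK API. Pass the browser's `navigator` object.
function hasCaptureSupport(
  nav: { mediaDevices?: { getUserMedia?: unknown } } | undefined
): boolean {
  // getUserMedia is only exposed in secure contexts on supported browsers
  return typeof nav?.mediaDevices?.getUserMedia === 'function';
}

// In the browser: hasCaptureSupport(navigator)
```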

Integration and Invocation

1. Integrate SDK and plugin

You should have created a project in the ZEGO Console and obtained a valid AppID; for details, refer to "Project Information" under Console - Project Management. Before you begin, you need to integrate both the ZEGO audio and video SDK and the Meishe Web SDK into your project.

Tip: The authorization files, template packages, model packages, and data packages in all of the following examples can be obtained by contacting Meishe business staff.

1.1 Integrate ZEGO audio and video SDK

Meishe AR Face Filter must be used with ZEGO Web SDK 3.x (v3.0.0 or later). Refer to the following documentation to integrate the Web SDK and implement a basic video call:

[Implementing Video Calling](Web JavaScript Real-Time Audio and Video SDK - Implementing Video Calling - ZEGO Developer Center)

1.2 Integrate the Meishe AR Face Filter plugin

Follow these steps to integrate the plugin:

Use npm to integrate the Meishe AR Face Filter plugin into your project.

  1. Install the ZEGO SDK and the Meishe AR Face Filter plugin by running the following command:

    shell
    npm i zego-express-engine-webrtc meishewasmloader
  2. Add the following code to your file to import the plugin module:

    typescript
    import ZegoLocalStream from 'zego-express-engine-webrtc/sdk/code/zh/ZegoLocalStream.web';
    import { ZegoExpressEngine } from 'zego-express-engine-webrtc';
    import { WASMLoader } from 'meishewasmloader';

2. Configure the Meishe AR Face Filter environment

Meishe AR Face Filter requires configuring response headers to make SharedArrayBuffer available.

2.1 vite environment

Add the following configuration to the plugins configuration in vite.config.ts:

typescript
plugins:[
    {
      name: 'configure-response-headers',
      configureServer: (server) => {
        server.middlewares.use((_req, res, next) => {
          res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
          res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
          res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
          next()
        })
      },
      configurePreviewServer: (server) => {
        server.middlewares.use((_req, res, next) => {
          res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
          res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
          res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
          next()
        })
      },
    },
    ...
]

2.2 webpack environment

Add the following devServer configuration to webpack.config.ts:

typescript
devServer:{
	headers: {
      'Cross-Origin-Opener-Policy':'same-origin',
      'Cross-Origin-Embedder-Policy':'require-corp',
      'Cross-Origin-Resource-Policy': 'cross-origin'
    },
    ...
}

The purpose of the above configuration is to add these headers to the response, which instructs the browser to enable SharedArrayBuffer.

The exact syntax above depends on your webpack or vite version; configure the equivalent API for the version you use.
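To confirm at runtime that the headers above actually took effect, you can check the page's isolation state before loading the WASM. The helper name below is illustrative (not an SDK API), but `crossOriginIsolated` and `SharedArrayBuffer` are standard browser globals.

```typescript
// Illustrative helper, not an SDK API: returns true when SharedArrayBuffer
// is usable, i.e. the COOP/COEP response headers were applied.
function isIsolated(g: {
  crossOriginIsolated?: boolean;
  SharedArrayBuffer?: unknown;
}): boolean {
  return g.crossOriginIsolated === true && typeof g.SharedArrayBuffer !== 'undefined';
}

// In the browser: isIsolated(window as any)
```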

3. Initialize Meishe AR Face Filter

Refer to the following steps to initialize Meishe AR Face Filter:

  1. Import the API file NvEffectSdk.js

    html
    <script src="xxx/meisheSDK-CDN-Url"></script>
  2. Use WASMLoader to load the WASM-related files.

    js
    const wasmLoader = WASMLoader({
        // Load the progress callback
        showLoader: function (state) {},
        // Failure callback
        showError(errorText: string) {},
        // Success callback
        loadingFinished() {},
    });
    wasmLoader.loadEmscriptenModule("xxx/meisheSDK-CDN-Url", { effectSdk:true });
    // The addresses in steps 1 and 2 must be consistent; otherwise the results will be unexpected.
  3. Authorization verification

    To facilitate customer access testing, the Meishe Web EffectSDK lets all effects be used without authorization verification when the page is launched from localhost. When deploying in a non-localhost environment, contact Meishe business staff to obtain the SDK authorization file, then perform authorization verification as follows: in the WASM-load success callback loadingFinished, call verifySdkLicenseFileUrl() to validate the authorization:

    if(nveGetEffectContextInstance().verifySdkLicenseFileUrl('xxx.lic')) {
        // Successful authorization
    } else {
        // authorization failed
    }

4. Initialize the ZEGO engine

Create the ZegoExpressEngine instance. For the appId and server values, refer to "Project Information" under Console - Project Management.

typescript
const zg = new ZegoExpressEngine(appId, server);

5. Initialize the ARScene shortcut renderer

The purpose of initializing the ARScene shortcut renderer is to load the SDK together with a series of model and data packages, enabling recognition and loading of beauty, makeup, face shaping, avatar, portrait, sticker, caption, background segmentation, special-effect prop, and other features.

The configuration is as follows:

typescript
import jszip from 'jszip';
let arSceneRenderer = new NveARSceneRenderer();
// arSceneRenderer is the Meishe AR Face Filter instance and needs to be saved
arSceneRenderer.init({
  faceModelUrl: 'https://xxx',
  eyecontourModelUrl: 'https://xxx',
  avatarModelUrl: 'https://xxx',
  segmentationModelUrl: 'https://xxx',
  makeupDataUrl: 'https://xxx',
  fakefaceDataUrl: 'https://xxx',
  faceCommonDataUrl: 'https://xxx',
  advancedBeautyDataUrl: 'https://xxx',
  detectionMode: 32768 | 32, // sdkFlag
  ratio: width + ':' + height,
  mirror: true,
  jszip,
})

The arSceneRenderer returned after initializing the Meishe AR Face Filter plugin is of type NveARSceneRenderer; its properties and methods can be found in the type documentation.

6. Enable Meishe AR Face Filter effects

Follow these steps to enable special effects:

  1. Call the browser's navigator.mediaDevices.getUserMedia method to open the camera and obtain the initial stream:

    typescript
    const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  2. Call the SDK's pushMediaStream method to send the stream to the SDK:

    typescript
    arSceneRenderer.pushMediaStream(stream);
  3. Call the SDK's getOutputStream method to get the processed stream:

    typescript
    const sdkStream = arSceneRenderer.getOutputStream();
  4. Call the shortcut renderer's enableBeauty method to enable basic beauty:

    typescript
    arSceneRenderer.enableBeauty(true);

7. Connect the ZEGO SDK

  1. Take the video stream obtained from the Meishe SDK in step 6.3 as a third-party stream, and call createZegoStream to create a ZEGO video stream.

    typescript
    const zegoStream = await zg.createZegoStream({
      custom: {
        video: {
          source: sdkStream as MediaStream,
        }
      }
    });
  2. Log in to the room by calling the ZEGO API loginRoom. The related process and parameters are described in the official audio and video call documentation.

    typescript
    zg.loginRoom(
        roomid, 
        token, 
        { userID: userid, userName:nickName }, 
        { userUpdate: true },
    );
  3. Use startPublishingStream to push the stream

    typescript
    let streamID = new Date().getTime().toString();
    zg.startPublishingStream(streamID, zegoStream, { videoCodec: 'H264' });
  4. Preview the streaming video locally

    typescript
    zegoStream.playCaptureVideo(document.getElementById('local-preview'), {
        objectFit: 'cover',
    });
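The timestamp-based stream ID in step 3 can collide if two clients publish within the same millisecond. A hedged alternative (the helper below is illustrative, not a ZEGO API) appends the user ID and a random suffix:

```typescript
// Illustrative generator for a unique publish stream ID; not part of the ZEGO SDK.
function makeStreamID(userID: string): string {
  const rand = Math.random().toString(36).slice(2, 8); // up to 6 random base-36 chars
  return `${userID}-${Date.now()}-${rand}`;
}

// Usage sketch:
// zg.startPublishingStream(makeStreamID(userid), zegoStream, { videoCodec: 'H264' });
```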

Sample code

The following code implements the plugin functionality and is listed for reference.

  1. vite.config.ts config

    typescript
    export default defineConfig({
      plugins: [
        react(),
        {
          name: 'configure-response-headers',
          configureServer: (server) => {
            server.middlewares.use((_req, res, next) => {
              res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
              res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
              res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
              next()
            })
          },
          configurePreviewServer: (server) => {
            server.middlewares.use((_req, res, next) => {
              res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
              res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
              res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
              next()
            })
          },
        },
      ],
    })
  2. Functional code

    typescript
    import jszip from 'jszip';
    import { WASMLoader } from 'meishewasmloader';
    import { ZegoExpressEngine } from 'zego-express-engine-webrtc';
    
    const wasmLoader = WASMLoader({
        // Load the progress callback     
        showLoader: function (state) {},
        // Failure callback
        showError(errorText: string) {},
        // Success callback
        loadingFinished() {},
    });
    wasmLoader.loadEmscriptenModule("xxx/meishe-SDK-CDN-url", { effectSdk:true });
    
    let arSceneRenderer = new NveARSceneRenderer();
    await arSceneRenderer.init({
      faceModelUrl: 'https://xxx',
      eyecontourModelUrl: 'https://xxx',
      avatarModelUrl: 'https://xxx',
      segmentationModelUrl: 'https://xxx',
      makeupDataUrl: 'https://xxx',
      fakefaceDataUrl: 'https://xxx',
      faceCommonDataUrl: 'https://xxx',
      advancedBeautyDataUrl: 'https://xxx',
      detectionMode: 32768 | 32,
      ratio: width + ':' + height,
      sdkCDNUrl: 'https://xxx',
      licFileUrl: 'xxx.lic',
      mirror: true,
      jszip,
    })
    
    // Initialize the ZEGO engine
    const zg = new ZegoExpressEngine(appId, server);
    
    // Call the browser API to start the camera to get the audio and video stream
    const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
    
    // Pass it to MeiSheSDK
    arSceneRenderer.pushMediaStream(stream);
    
    // Obtain the processed audio and video streams for use as third-party streams
    let sdkStream = arSceneRenderer.getOutputStream();
    
    // Enable basic beauty function
    arSceneRenderer.enableBeauty(true);
    
    // Call createZegoStream to create a ZEGO video stream
    const zegoStream = await zg.createZegoStream({
      custom: {
        video: {
          source: sdkStream as MediaStream,
        }
      },
    });
    
    // Call loginRoom to login to the room
    await zg.loginRoom(
        roomid,
        token,
        { userID: userid, userName: nickName },
        { userUpdate: true },
    );
    
    // Publish stream ID; it must be unique
    let streamID = new Date().getTime().toString();
    
    // Call startPublishingStream to push the stream
    zg.startPublishingStream(streamID, zegoStream, { videoCodec: 'H264' });
    
    // Local preview
    zegoStream.playCaptureVideo(document.getElementById('local-preview'), {
      objectFit: 'cover',
    });
    
    // Set the beauty
    const effectList = [
      {
        key: "Advanced Beauty Intensity",
        intensity: 1,
      },
      {
        key: "Beauty Whitening",
        intensity: 1,
      },
    ];
    
    // Add makeup
    effectList.push({
      url: "https://xxx.makeup",
      licUrl: "xxxx",
      intensity: 1,
    });
    
    // Add a filter
    effectList.push({
      url: "https://xxxx",
      licUrl: "xxxx",
      intensity: 1,
    });
    
    // Apply filters and effects
    arSceneRenderer.setEffectList(effectList);

API reference

Meishe AR Face Filter API

NveARSceneRenderer

typescript
new NveARSceneRenderer();

Create the SDK shortcut renderer instance.

init

typescript
arSceneRenderer.init()

Initializes the SDK shortcut renderer and returns the instance. The plugin instance class is NveARSceneRenderer, and the init method belongs to the NveARSceneRenderer class.

enableBeauty

typescript
enableBeauty()

Turns beauty effects on or off; this controls only the built-in beauty effects. This method is located under the NveARSceneRenderer plug-in instance.

setEffectList

typescript
setEffectList()

Sets the list of effects; it can include beauty, makeup, face shaping, filters, props, and backgrounds. This method is located under the NveARSceneRenderer plug-in instance.

getEffectList

typescript
getEffectList()

Gets a list of all the effects that have been set, including added beauty, filters, props, and more. This method is located under the NveARSceneRenderer plug-in instance.
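For example, one way to adjust a single built-in effect is to read the current list, rewrite the matching entry, and pass the result back to setEffectList. The helper below is a hypothetical sketch; its name, and the assumption that the entries of interest carry key and intensity fields, are illustrative.

```typescript
// Illustrative helper, not an SDK API: returns a copy of the effect list with
// one built-in effect's intensity changed, leaving other entries untouched.
type BuiltInEffect = { key: string; intensity: number };

function withIntensity(
  list: BuiltInEffect[],
  key: string,
  intensity: number
): BuiltInEffect[] {
  return list.map((e) => (e.key === key ? { ...e, intensity } : e));
}

// Usage sketch:
// const next = withIntensity(arSceneRenderer.getEffectList(), 'Beauty Whitening', 0.5);
// arSceneRenderer.setEffectList(next);
```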

createExternalEffectInstance

typescript
createExternalEffectInstance()

Creates extended effect instances, used to create caption and sticker instances. This method is located under the NveARSceneRenderer plug-in instance.

appendExternalEffectInstance

typescript
appendExternalEffectInstance()

Add an extended effects instance to the plugin instance. This method is located under the NveARSceneRenderer plug-in instance.

setExternalEffectInstanceList

typescript
setExternalEffectInstanceList()

Sets the list of added extension effects, replacing the current list. This method is located under the NveARSceneRenderer plug-in instance.

Special effect setting

Beauty

  1. Built-in effects

    Built-in special effects do not require an effects package; they are a built-in capability of the Meishe AR Face Filter plugin, but you need to know the key name. The specific keys can be found in the Meishe SDK Web documentation, roughly as follows:

    typescript
    // key is the built-in special effects keyword used to distinguish special effects functions
    // intensity is the effect strength; depending on the effect, the range is [-1,1] or [0,1]
    // the SDK has hundreds of built-in effects; ranges and key values can be looked up in the documentation
    const beautyArray = [
    	{
    		key:'Advanced Beauty Intensity',
    		intensity:1
    	},
        {
    		key:'Beauty Whitening',
    		intensity:1
    	},
        {
    		key:'Beauty Reddening',
    		intensity:1
    	},
        {
    		key:'Face Mesh Face Width Degree',
    		intensity:1
    	},
        {
    		key:'Face Mesh Face Length Degree',
    		intensity:1
    	},
        {
    		key:'Face Mesh Face Size Degree',
    		intensity:1
    	},
        {
    		key:'Face Mesh Forehead Height Degree',
    		intensity:1
    	},
        ...
    ]
    
    arSceneRenderer.setEffectList(beautyArray)
  2. Beauty template

    Unlike built-in effects, beauty templates require passing in a template package rather than a key value. The code is as follows:

    typescript
    const templateArray = [
        {
            url:"https://xxxx",
            licUrl: "xxxx",
            intensity:1,
        },
        {
            url:"https://xxxx",
            licUrl: "xxxx",
            intensity:1,
        },
    ]
    arSceneRenderer.setEffectList(templateArray)

Makeup

  1. Normal package

    Normal makeup packages cover lipstick, eyeshadow, eyebrows, eyelashes, eyeliner, blush, highlight, shadow, contact lenses, and a series of other makeup categories. The configuration method is the same as for beauty templates; the code is as follows:

    typescript
    const makeupArray = [
        {
            url:"https://xxx.makeup",
            licUrl: "xxxx",
            intensity:1,
        },
        {
            url:"https://xxx.makeup",
            licUrl: "xxxx",
            intensity:1,
        },
    ]
    
    arSceneRenderer.setEffectList(makeupArray)
  2. Complete package

    A complete package integrates a series of normal packages. The configuration method is as follows:

    typescript
    arSceneRenderer.setEffectList([
        {
            url:"https://xxx.zip",
        },
    ])

Filter

Filters are configured in the same way as makeup packages; the code is as follows:

typescript
const filterArray = [
    {
        url:"https://xxxx",
        licUrl: "xxxx",
        intensity:1,
    }
]
arSceneRenderer.setEffectList(filterArray)

Virtual background

  1. Background blur

    You need to specify the background blur special-effects package, as follows:

    typescript
    arSceneRenderer.setEffectList([
        {
            url:"https://xxx.videofx",
            licUrl:''
        }
    ])
  2. Background substitution

    The background replacement configuration is as follows:

    typescript
    arSceneRenderer.setEffectList([
        {segmentationBackgroundUrl:"https://xxx.png"}//image url
    ])

AR scene

The configuration method is the same as the filter, and the code is as follows:

typescript
arSceneRenderer.setEffectList([
    { 
    	url: "https://xxx.arscene",
    	licUrl: '' 
    }
])

Caption

1 Common caption

Common caption categories include floral text, animated captions, word-by-word captions, bubble captions, and so on. The configuration method differs from that of makeup.

createExternalEffectInstance method parameter information table:

| Parameter | Type | Required | Default | Explanation |
| --- | --- | --- | --- | --- |
| modular | Boolean | No | false | Whether this is a modular caption; modular captions can add effects such as bubbles and animations. |
| text | String | Yes | - | Caption text. |
| inPoint | Number | Yes | - | Start time, in microseconds. |
| duration | Number | Yes | - | Duration, in microseconds; set to Number.MAX_SAFE_INTEGER for continuous display. |
| url | String | No | - | Common caption package address. |
| licUrl | String | No | - | License file address for the common caption package. |
| fontFileUrl | String | No | - | Font file address; if not set, non-English text renders incorrectly because the corresponding font cannot be found. |
| captionRendererUrl | String | No | - | Floral-text effect style package address. |
| captionRendererLicUrl | String | No | - | License file address for the floral-text effect style package. |
| captionContextUrl | String | No | - | Bubble effect style package address. |
| captionContextLicUrl | String | No | - | License file address for the bubble effect style package. |
| captionAnimationUrl | String | No | - | Loop animation effect style package address. |
| captionAnimationLicUrl | String | No | - | License file address for the animation effect style package. |
| animationPeriod | Number | No | - | Animation effect period, in milliseconds. |
| captionInAnimationUrl | String | No | - | In-animation style package address. |
| captionInAnimationLicUrl | String | No | - | License file address for the in-animation style package. |
| inAnimationDuration | Number | No | - | Duration of the in-animation, in milliseconds. |
| captionOutAnimationUrl | String | No | - | Out-animation style package address. |
| captionOutAnimationLicUrl | String | No | - | License file address for the out-animation style package. |
| outAnimationDuration | Number | No | - | Duration of the out-animation, in milliseconds. |
| positionX | Number | No | - | Horizontal position, normalized to [-1, 1]; 0 is center, positive is to the right. |
| positionY | Number | No | - | Vertical position, normalized to [-1, 1]; 0 is center, positive is upward. |
| position | String | No | - | Orientation; allowed values: top-left, top, top-right, left, center, right, bottom-left, bottom, bottom-right. |
  1. add

    Create a caption with createExternalEffectInstance, then add it with appendExternalEffectInstance.

    In the following examples, the various caption effects are created with the same API and different parameters; refer to the parameter table above for details.

    typescript
    // common caption
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
        }
    );
    
    // Create regular subtitles with subtitles style
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
            url: "https://xxx.captionstyle",
            licUrl: "",
        }
    );
    
    // Create regular captions with captioning styles and a splash effect
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
            url: "https://xxx.captionstyle",
            licUrl: "",
            captionRendererUrl: "https://xxx.captionrenderer",
            captionRendererLicUrl: "",
        }
    );
    
    // Create regular captions with location information
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
            positionX: -0.6,
            positionY: 0.9,
        }
    );
    
    // Create regular captions with orientation information
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
            position: "top-right",
        }
    );
    
    // Create plain subtitles with fonts
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            text: "caption",
            inPoint: 0,
            duration: 5000000,
            fontFileUrl: "https://xxx.ttf",
        }
    );
    
    // Create unstyled module captions
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            modular: true,
            text: "modularCaption",
            inPoint: 0,
            duration: 5000000,
        }
    );
    
    // Create module captions with a splash effect
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            modular: true,
            text: "modularCaption",
            inPoint: 0,
            duration: 5000000,
            captionRendererUrl: "https://xxx.captionrenderer",
            captionRendererLicUrl: "",
        }
    );
    
    // Create a module caption with a bubble effect
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            modular: true,
            text: "modularCaption",
            inPoint: 0,
            duration: 5000000,
            captionContextUrl: "https://xxx.captioncontext",
            captionContextLicUrl: "",
        }
    );
    
    // Create module captions with animated effects
    const caption = await arSceneRenderer.createExternalEffectInstance(
        {
            modular: true,
            text: "modularCaption",
            inPoint: 0,
            duration: 5000000,
            captionAnimationUrl: "https://xxx.captionanimation",
            captionAnimationLicUrl: "",
            animationPeriod: 2500,
        }
    );
    
    // Use the interface of the caption instance to modify the other attributes of the caption
    caption.scaleCaption2(0.8);
    caption.setUnderline(true);
    
    // Adding a caption instance
    arSceneRenderer.appendExternalEffectInstance(caption);

    The common caption instance type is NveCaption, with various properties and methods available; refer to the linked documentation.

  2. get

    To obtain the list of all currently set extension effects, use the getExternalEffectInstanceList method. The code is as follows:

    typescript
    // Get the list of all extended effects. Note that extended effects are not only captions; other effects may need to be filtered out
    // You can also maintain your own list of the caption instances you have set
    let instanceList = arSceneRenderer.getExternalEffectInstanceList();
    let captionInstanceList = instanceList.filter(instance=>instance instanceof NveCaption)
  3. delete

    Get the list via getExternalEffectInstanceList, find the item whose id matches the caption instance to delete, and remove it. Then reset the list via setExternalEffectInstanceList. The code is as follows:

    typescript
    let deleteInstanceId = 'xxxx' // The id of the caption instance to remove
    let instanceList = arSceneRenderer.getExternalEffectInstanceList() // Get the full, unfiltered list of extended effects
    let index = instanceList.findIndex(instance => instance.id === deleteInstanceId) // Find the index of the instance to delete
    if (index !== -1) {
        instanceList[index].release() // Release resources
        instanceList.splice(index, 1)
        arSceneRenderer.setExternalEffectInstanceList(instanceList); // Reset the list of extended effects
    }
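The find-and-remove step above can also be factored into a small pure helper, which makes it easy to unit-test. The helper below is illustrative (not an SDK API); it assumes each instance exposes an id and, optionally, a release() method, as in the snippet above.

```typescript
// Illustrative helper: removes the first item whose id matches, calling its
// optional release() to free resources, and returns a new list.
type Releasable = { id: string; release?: () => void };

function removeInstanceById<T extends Releasable>(list: T[], id: string): T[] {
  const index = list.findIndex((item) => item.id === id);
  if (index === -1) return list; // nothing to remove
  list[index].release?.(); // free SDK-side resources if present
  return [...list.slice(0, index), ...list.slice(index + 1)];
}

// Usage sketch:
// arSceneRenderer.setExternalEffectInstanceList(
//   removeInstanceById(arSceneRenderer.getExternalEffectInstanceList(), deleteInstanceId)
// );
```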
2 Compound caption

For the createExternalEffectInstance parameters, refer to the common caption parameter table.

The code is as follows:

typescript
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.compoundcaption",
        licUrl: "",
    }
);

// Create compound captions with location information
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.compoundcaption",
        licUrl: "",
        positionX: -0.6,
        positionY: 0.9,
    }
);

// Create a compound caption with orientation information
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.compoundcaption",
        licUrl: "",
        position: "top-right",
    }
);

// Modify other properties using the interface of the composite caption instance
compoundCaption.scaleCaption2(0.8);
compoundCaption.setText(0, "caption0");

// Add a composite caption instance
arSceneRenderer.appendExternalEffectInstance(compoundCaption);

The compound caption instance type is NveCompoundCaption, which can be configured in the same way as common captions.

Animated Sticker

Stickers are created in a similar way to captions, but the parameters differ somewhat. The parameter table is as follows:

| Parameter | Type | Required | Default | Explanation |
| --- | --- | --- | --- | --- |
| inPoint | Number | Yes | - | Start time of the animated sticker, in microseconds. |
| duration | Number | Yes | - | Duration of the animated sticker, in microseconds; set to Number.MAX_SAFE_INTEGER for continuous display. |
| url | String | No | - | Animated sticker package address. |
| licUrl | String | No | - | License file address for the animated sticker package. |
| animatedStickerAnimationUrl | String | No | - | Loop animation style package address. |
| animatedStickerAnimationLicUrl | String | No | - | License file address for the animation style package. |
| animationPeriod | Number | No | - | Animation effect period, in milliseconds. |
| animatedStickerInAnimationUrl | String | No | - | In-animation style package address. |
| animatedStickerInAnimationLicUrl | String | No | - | License file address for the in-animation style package. |
| inAnimationDuration | Number | No | - | Duration of the in-animation, in milliseconds. |
| animatedStickerOutAnimationUrl | String | No | - | Out-animation style package address. |
| animatedStickerOutAnimationLicUrl | String | No | - | License file address for the out-animation style package. |
| outAnimationDuration | Number | No | - | Duration of the out-animation, in milliseconds. |
| positionX | Number | No | - | Horizontal position, normalized to [-1, 1]; 0 is center, positive is to the right. |
| positionY | Number | No | - | Vertical position, normalized to [-1, 1]; 0 is center, positive is upward. |
| position | String | No | - | Orientation; allowed values: top-left, top, top-right, left, center, right, bottom-left, bottom, bottom-right. |

The code is as follows:

typescript
// Create stickers that don't animate
const sticker = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.animatedsticker",
        licUrl: "",
    }
);

// Create animated stickers
const sticker = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.animatedsticker",
        licUrl: "",
        animatedStickerAnimationUrl: "https://xxx.animatedstickeranimation",
        animatedStickerAnimationLicUrl: "",
        animationPeriod: 5000,
    }
);

// Create stickers with location information
const sticker = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.animatedsticker",
        licUrl: "",
        positionX: -0.6,
        positionY: 0.9,
    }
);

// Create stickers with orientation information
const sticker = await arSceneRenderer.createExternalEffectInstance(
    {
        inPoint: 0,
        duration: 5000000,
        url: "https://xxx.animatedsticker",
        licUrl: "",
        position: "top-right",
    }
);

// Modify other properties using the interface of the animated sticker instance
sticker.scaleAnimatedSticker2(0.5);

// Add an instance of animated stickers
arSceneRenderer.appendExternalEffectInstance(sticker);

The sticker instance type is NveAnimatedSticker, which can be operated and configured following the same approach as captions.