Meishe Web SDK 3.10.2
Detailed effectSDK development guide and sample code

1 Overview

1.1 Notes

  1. Runtime environment: almost all mainstream browsers are supported, including Chrome, Firefox, Edge, and Safari.
  2. The Meishe SDK relies on SharedArrayBuffer, so the following two response headers must be added in both the development and deployment environments to achieve cross-origin isolation:
      Cross-Origin-Opener-Policy: same-origin
      Cross-Origin-Embedder-Policy: require-corp
    
    Taking a development environment as an example:
      // vite.config.js
      import { defineConfig } from "vite";

      export default defineConfig({
          plugins: [
              // ...other plugins
              {
                  name: "configure-response-headers",
                  configureServer: (server) => {
                      server.middlewares.use((_req, res, next) => {
                          res.setHeader("Cross-Origin-Embedder-Policy", "require-corp");
                          res.setHeader("Cross-Origin-Opener-Policy", "same-origin");
                          next();
                      });
                  },
              },
          ],
          // ...other options
      })
    
  3. For a production deployment, the site must also be served from a secure context (HTTPS) for SharedArrayBuffer to be available.
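In production the same two headers have to come from the server. As a minimal sketch, a helper that applies them to any Node-style response object (the helper name is illustrative, not part of the SDK):

```javascript
// The two response headers required for cross-origin isolation
// (needed by SharedArrayBuffer in production as well as in development).
const ISOLATION_HEADERS = {
  "Cross-Origin-Opener-Policy": "same-origin",
  "Cross-Origin-Embedder-Policy": "require-corp",
};

// Apply the headers to a Node/Express-style response object.
function applyIsolationHeaders(res) {
  for (const [name, value] of Object.entries(ISOLATION_HEADERS)) {
    res.setHeader(name, value);
  }
  return res;
}
```

The same key/value pairs can equally be set in an nginx or CDN configuration; only the delivery mechanism differs.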

2 Quick Start

2.1 Importing the Library Files

  1. Include the Meishe SDK interface file NvEffectSdk.js:
     <script src="https://alieasset.meishesdk.com/NvWasm/domain/13/NvEffectSdk.js"></script>
    
  2. Load the wasm-related files with the loader utility provided by Meishe.
  • (a) Install meishewasmloader:
      npm i meishewasmloader --save
    
  • (b) Use WASMLoader to load the wasm-related files:
      import { WASMLoader }  from 'meishewasmloader';
      const wasmLoader = WASMLoader({
          // loading progress callback
          showLoader: function (state) {},
          // failure callback
          showError(errorText) {},
          // success callback
          loadingFinished() {},
      });
      wasmLoader.loadEmscriptenModule("https://alieasset.meishesdk.com/NvWasm/domain/13/", {effectSdk:true});
    
Note
https://alieasset.meishesdk.com/NvWasm/domain/13/ is a public test address provided by Meishe. Before your product goes live, contact the sales team to obtain the SDK files, deploy them privately in your own environment, and replace this address.
When deploying the wasm-related files privately, configure brotli or gzip compression for the wasm files; this noticeably speeds up loading.
The addresses used in steps 1 and 2 must be identical; otherwise the results are unpredictable.
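Whether the isolation headers from section 1.1 actually took effect can be checked at runtime through the standard crossOriginIsolated global before calling loadEmscriptenModule; a small sketch (the guard function is illustrative):

```javascript
// Returns true when the page is cross-origin isolated, i.e. the COOP/COEP
// headers were served correctly and SharedArrayBuffer is available.
function canUseSharedArrayBuffer(global = globalThis) {
  return global.crossOriginIsolated === true
    && typeof global.SharedArrayBuffer === "function";
}

// Usage (illustrative): fail early with a clear message instead of
// letting the wasm loader fail later.
// if (!canUseSharedArrayBuffer()) {
//   throw new Error("COOP/COEP headers missing: SharedArrayBuffer unavailable");
// }
```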

2.2 License Verification

To simplify integration testing, the Meishe web EffectSdk does not require license verification when the site is served from localhost; all effects are available there.

  • When deploying in a non-localhost environment, contact the sales team to obtain an SDK license file, then verify the license as follows.
  • Call verifySdkLicenseFile() inside loadingFinished, the callback invoked once the wasm files have loaded successfully:
      import axios from 'axios';
    
      const response = await axios.get('/static/effectsdk.lic', { responseType: 'arraybuffer' });
      FS.writeFile('/effectsdk.lic', new Uint8Array(response.data));
      if (nveGetEffectContextInstance().verifySdkLicenseFile('/effectsdk.lic')) {
          // license verification succeeded
      } else {
          // license verification failed
      }
    

If license verification fails, none of the SDK effects can be used.

Note
FS is the wasm file system; every file path the SDK uses must live inside it. FS.mkdir() creates a directory and FS.writeFile() writes a file; for more methods, refer to the Emscripten FS API documentation.
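The sections below repeat the same pattern: download a file, then write it into FS under its basename. As a sketch, that pattern can be factored into a helper (the names are illustrative; FS stands for the Emscripten file system object, and the fetch implementation is injectable):

```javascript
// Derive the FS path from a URL: keep only the basename, rooted at "/".
function fsPathFromUrl(url) {
  return "/" + url.split("/").pop();
}

// Download a URL and write its bytes into the wasm file system.
// `fs` is the Emscripten FS object; `fetchImpl` defaults to the global fetch.
async function downloadToFS(fs, url, fetchImpl = fetch) {
  const response = await fetchImpl(url);
  const buffer = await response.arrayBuffer();
  const path = fsPathFromUrl(url);
  fs.writeFile(path, new Uint8Array(buffer));
  return path;
}
```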

2.3 Initializing the Model Files

  • Download and initialize the face model
      import axios from 'axios';
    
      const faceModelUrl = "https://alieasset.meishesdk.com/model/ms_face240_v2.0.8.model";
      const response = await axios.get(faceModelUrl, { responseType: 'arraybuffer' });
      const faceModel = '/' + faceModelUrl.split('/').pop();
      await FS.writeFile(faceModel, new Uint8Array(response.data));
      nveGetEffectContextInstance().initHumanDetection(
          faceModel, 
          "", 
          NveHumanDetectionFeatureEnum.FaceLandmark | NveHumanDetectionFeatureEnum.FaceAction 
          | NveHumanDetectionFeatureEnum.SemiImageMode | NveHumanDetectionFeatureEnum.MultiThread
      );
    
Note
If you only process still images, replace NveHumanDetectionFeatureEnum.SemiImageMode with NveHumanDetectionFeatureEnum.ImageMode.
  • Download and initialize the eyeball model
      import axios from 'axios';
    
      const eyeballModelUrl = "https://alieasset.meishesdk.com/model/ms_eyecontour_v1.0.2.model";
      const response = await axios.get(eyeballModelUrl, { responseType: 'arraybuffer' });
      const eyeballModel = '/' + eyeballModelUrl.split('/').pop();
      await FS.writeFile(eyeballModel, new Uint8Array(response.data));
      nveGetEffectContextInstance().initHumanDetectionExt(
          eyeballModel, 
          "", 
          NveHumanDetectionFeatureEnum.EyeballLandmark | NveHumanDetectionFeatureEnum.SemiImageMode 
          | NveHumanDetectionFeatureEnum.MultiThread
      );
    
Note
If you only process still images, replace NveHumanDetectionFeatureEnum.SemiImageMode with NveHumanDetectionFeatureEnum.ImageMode.
initHumanDetectionExt only takes effect after initHumanDetection has been called. If you initialize a single model, it must go through initHumanDetection; if you initialize several models, the face model must be initialized first.
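The ordering rule above can be encoded in a small wrapper so it cannot be violated by accident; a sketch, where ctx stands for the object returned by nveGetEffectContextInstance() and the wrapper name is illustrative:

```javascript
// Initialize the face model first via initHumanDetection, then any
// additional models (eyeball, segmentation, ...) via initHumanDetectionExt,
// which is only valid once initHumanDetection has been called.
function initModels(ctx, faceModel, faceFlags, extraModels = []) {
  ctx.initHumanDetection(faceModel, "", faceFlags);
  for (const { model, flags } of extraModels) {
    ctx.initHumanDetectionExt(model, "", flags);
  }
}
```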
  • Download and initialize the human segmentation model
      import axios from 'axios';
    
      const segmentationModelUrl = "https://alieasset.meishesdk.com/model/ms_humanseg_v1.0.17.model";
      const response = await axios.get(segmentationModelUrl, { responseType: 'arraybuffer' });
      const segmentationModel = '/' + segmentationModelUrl.split('/').pop();
      await FS.writeFile(segmentationModel, new Uint8Array(response.data));
      nveGetEffectContextInstance().initHumanDetectionExt(
          segmentationModel,
          "", 
          NveHumanDetectionFeatureEnum.SegmentationBackground
      );
    
Note
initHumanDetectionExt only takes effect after initHumanDetection has been called. If you initialize a single model, it must go through initHumanDetection; if you initialize several models, the face model must be initialized first.

2.4 Setting Up the Data Package Resources

  • Method setupHumanDetectionData
      import axios from 'axios';
    
      const faceDataUrl = "https://alieasset.meishesdk.com/model/makeup2_240_v2.1.2.dat";
      const response = await axios.get(faceDataUrl, { responseType: 'arraybuffer' });
      const faceData = '/' + faceDataUrl.split('/').pop();
      await FS.writeFile(faceData, new Uint8Array(response.data));
      nveGetEffectContextInstance().setupHumanDetectionData(NveHumanDetectionDataTypeEnum.Makeup2, faceData);
    

2.5 Installing Effect Package Resources

  • Taking a makeup effect package as an example
      import axios from 'axios';
    
      const packageUrl = "https://qasset.meishesdk.com/material/pu/makeup/FDFB2239-44D1-491D-85A1-7A537860118E/FDFB2239-44D1-491D-85A1-7A537860118E.1.makeup";
      const response = await axios.get(packageUrl, { responseType: 'arraybuffer' });
      const packageFile = '/' + packageUrl.split('/').pop();
      await FS.writeFile(packageFile, new Uint8Array(response.data));
      let packageLicenseFile = '';
      let packageType = NveAssetPackageTypeEnum.Makeup;
      const assetPackageManager = nveGetEffectContextInstance().getAssetPackageManager();
      assetPackageManager.onFinishAssetPackageInstallation = (assetPackageId, assetPackageFilePath, assetPackageType, error) => {};
      assetPackageManager.installAssetPackage(packageFile, packageLicenseFile, packageType);
    
Note
onFinishAssetPackageInstallation() is the callback invoked when the effect package installation finishes.
  • packageType is of type NveAssetPackageTypeEnum; each kind of effect package resource corresponds to a different type.
  • packageFile must keep the same name as the original resource package; renaming it causes the installation to fail.
  • packageLicenseFile may be an empty string while the SDK license has not been verified. Once the SDK license file has been verified, every effect package needs its own license file; this license is specific to each package and is not the SDK license file.
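Because renaming the package file breaks installation, a small guard before installAssetPackage can catch accidental renames; a sketch (the helper name is illustrative):

```javascript
// Check that the FS path used for installation ends with the exact
// basename of the original package URL; renamed packages fail to install.
function keepsOriginalName(packageUrl, fsPath) {
  const original = packageUrl.split("/").pop();
  return fsPath.split("/").pop() === original;
}
```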

2.6 Creating an Effect Instance and Rendering

  • Reference method renderEffects()
      const ratio = new NveRational(16, 9);
      const effectInstance = nveGetEffectContextInstance().createVideoEffect('Pinky', ratio, true);
      const result = await nveGetEffectContextInstance().renderEffects([effectInstance],
                                                                      inputImageData,
                                                                      Math.floor(performance.now()),
                                                                      NveRenderFlagEnum.OutputImageBitmap);
    
Note
inputImageData is the input image data, in VideoFrame or ImageData format; the returned result.imageBitmap is the output image, in ImageBitmap format.

3 Sample

  • The complete code follows
      import axios from 'axios';
    
      async installAsset(packageUrl, packageLicenseUrl) {
          let response = await axios.get(packageUrl, { responseType: 'arraybuffer' });
          const packageFile = '/' + packageUrl.split('/').pop();
          const assetUuid = packageUrl.split('/').pop().split('.')[0];
          const suffix = packageFile.split('.').pop();
          await FS.writeFile(packageFile, new Uint8Array(response.data));
    
          let packageLicenseFile = '';
          if (packageLicenseUrl) {
              response = await axios.get(packageLicenseUrl, { responseType: 'arraybuffer' });
              packageLicenseFile = '/' + packageLicenseUrl.split('/').pop();
              await FS.writeFile(packageLicenseFile, new Uint8Array(response.data));
          }
    
          let assetType;
          if (suffix === 'videofx') {
              assetType = NveAssetPackageTypeEnum.VideoFx;
          } else if (suffix === 'captionstyle') {
              assetType = NveAssetPackageTypeEnum.CaptionStyle;
          } else if (suffix === 'animatedsticker') {
              assetType = NveAssetPackageTypeEnum.AnimatedSticker;
          } else if (suffix === 'videotransition') {
              assetType = NveAssetPackageTypeEnum.VideoTransition;
          } else if (suffix === 'makeup') {
              assetType = NveAssetPackageTypeEnum.Makeup;
          } else if (suffix === 'facemesh') {
              assetType = NveAssetPackageTypeEnum.FaceMesh;
          }
          return new Promise((resolve, reject) => {
              if (!assetType) {
                  reject(assetUuid);
                  return;
              }
    
              const assetPackageManager = nveGetEffectContextInstance().getAssetPackageManager();
              // Check the package status; if it is already installed, there is no need to install it again
              const status = assetPackageManager.getAssetPackageStatus(assetUuid, assetType);
              if (status !== NveAssetPackageStatusEnum.NotInstalled) {
                  resolve(assetUuid);
                  return;
              }
    
              assetPackageManager.onFinishAssetPackageInstallation = (assetPackageId, assetPackageFilePath, assetPackageType, error) => {
                  FS.unlink(assetPackageFilePath);
                  if (error === 0 && assetPackageId === assetUuid) {
                      resolve(assetUuid);
                  } else {
                      reject(assetUuid);
                  }
              }
              assetPackageManager.installAssetPackage(packageFile, packageLicenseFile, assetType);
          })
      }
    
      const ratio = new NveRational(16, 9);
      const effectInstance = nveGetEffectContextInstance().createVideoEffect('AR Scene', ratio, true);
      // beauty (skin smoothing)
      effectInstance.setIntVal("Advanced Beauty Type", 0);
      effectInstance.setBooleanVal("Beauty Effect", true);
      effectInstance.setBooleanVal("Advanced Beauty Enable", true);
      effectInstance.setFloatVal("Advanced Beauty Intensity", 1);
      // whitening
      const lutUrl = '/static/test.mslut';
      const response = await axios.get(lutUrl, { responseType: 'arraybuffer' });
      const lutFile = '/' + lutUrl.split('/').pop();
      await FS.writeFile(lutFile, new Uint8Array(response.data));
      effectInstance.setBooleanVal("Whitening Lut Enabled", true);
      effectInstance.setStringVal("Whitening Lut File", lutFile);
      // face shaping
      effectInstance.setBooleanVal("Face Mesh Internal Enabled", true);
      const faceMeshUrl = 'https://qasset.meishesdk.com/material/pu/makeup/63BD3F32-D01B-4755-92D5-0DE361E4045A/63BD3F32-D01B-4755-92D5-0DE361E4045A.3.facemesh';
      const faceMeshUuid = await this.installAsset(faceMeshUrl);
      effectInstance.setStringVal("Face Mesh Face Size Custom Package Id", faceMeshUuid);
      // makeup
      const makeupUrl = 'https://qasset.meishesdk.com/material/pu/makeup/FDFB2239-44D1-491D-85A1-7A537860118E/FDFB2239-44D1-491D-85A1-7A537860118E.1.makeup';
      const makeupUuid = await this.installAsset(makeupUrl);
      effectInstance.setStringVal("Makeup Lip Package Id", makeupUuid);
    
Note
For more effect parameters, please refer to the effect parameter reference.
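The suffix-to-type chain in the sample above can also be written as a lookup table; a sketch in which NveAssetPackageTypeEnum is stubbed with placeholder values purely so the snippet is self-contained (the real enum values come from the SDK):

```javascript
// Stub standing in for the SDK enum, for illustration only.
const NveAssetPackageTypeEnum = {
  VideoFx: "VideoFx",
  CaptionStyle: "CaptionStyle",
  AnimatedSticker: "AnimatedSticker",
  VideoTransition: "VideoTransition",
  Makeup: "Makeup",
  FaceMesh: "FaceMesh",
};

// Map a package file suffix to its asset package type.
const SUFFIX_TO_TYPE = {
  videofx: NveAssetPackageTypeEnum.VideoFx,
  captionstyle: NveAssetPackageTypeEnum.CaptionStyle,
  animatedsticker: NveAssetPackageTypeEnum.AnimatedSticker,
  videotransition: NveAssetPackageTypeEnum.VideoTransition,
  makeup: NveAssetPackageTypeEnum.Makeup,
  facemesh: NveAssetPackageTypeEnum.FaceMesh,
};

// Returns the package type for a URL, or undefined for unknown suffixes.
function assetTypeForUrl(packageUrl) {
  const suffix = packageUrl.split(".").pop().toLowerCase();
  return SUFFIX_TO_TYPE[suffix];
}
```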