Meishe AR Face Filter (ZEGO)
This article explains how to integrate and use the Meishe AR Face Filter
plugin in your Web projects.
Prerequisites
Before you begin, ensure that the following requirements are met:
- For Windows and macOS computers, the following browser requirements apply (desktop browsers only; iPad and tablet devices are not supported):
  - Chrome, Opera, and 360 Browser (Chromium kernel): version 75 or later.
  - Firefox: version 58 or later (special case: version 72 is not supported).
  - Safari: 15.4 at minimum; 17.0 and above performs best, so the latest version is recommended.
- A physical audio and video capture device.
- A working internet connection.
- Node.js and npm installed.
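Rather than sniffing browser versions at runtime, you can also feature-detect the capabilities the plugin actually depends on. The helper below is an illustrative sketch (not part of either SDK); the environment object is injectable so the check can be unit-tested:

```typescript
// The capabilities Meishe AR Face Filter relies on, read from browser globals.
interface EnvLike {
  mediaDevices?: unknown;       // navigator.mediaDevices, for camera/microphone capture
  WebAssembly?: unknown;        // WASM runtime that runs the effect engine
  SharedArrayBuffer?: unknown;  // requires cross-origin isolation (see section 2)
}

// Returns the names of any required capabilities that are missing.
function missingCapabilities(env: EnvLike): string[] {
  const missing: string[] = [];
  if (!env.mediaDevices) missing.push('getUserMedia');
  if (!env.WebAssembly) missing.push('WebAssembly');
  if (!env.SharedArrayBuffer) missing.push('SharedArrayBuffer');
  return missing;
}

// In the browser you would call it as:
//   missingCapabilities({
//     mediaDevices: navigator.mediaDevices,
//     WebAssembly: (globalThis as any).WebAssembly,
//     SharedArrayBuffer: (globalThis as any).SharedArrayBuffer,
//   });
```

If the returned list is non-empty, the current browser cannot run the plugin regardless of its version string.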
Integration And Invocation
1. Integrate SDK and plugin
Make sure you have created a project in the ZEGO Console and applied for a valid AppID; for details, see "Project Information" under Console - Project Management. Before you begin, you need to integrate both the ZEGO audio/video SDK and the Meishe web SDK into your project.
tip:
The authentication files, template packages, model packages, and asset packages used in the examples below can be obtained by contacting the Meishe business team.
1.1 Integrate ZEGO audio and video SDK
Meishe AR Face Filter must be used with ZEGO Web SDK 3.x (v3.0.0 or later). Refer to the following documentation to integrate the Web SDK and implement a basic video call:
[Implementing Video Calling](Web JavaScript Real-Time Video SDK video call guide, ZEGO Developer Center)
1.2 Integrate the Meishe AR Face Filter plugin
Follow these steps to integrate the plugin:
Use npm to integrate the Meishe AR Face Filter plugin into your project.
Install the ZEGO SDK and the Meishe AR Face Filter by running the following command:

```shell
npm i zego-express-engine-webrtc meishewasmloader
```

Add the following code to your file to import the plugin module:

```typescript
import ZegoLocalStream from 'zego-express-engine-webrtc/sdk/code/zh/ZegoLocalStream.web';
import { ZegoExpressEngine } from 'zego-express-engine-webrtc';
import { WASMLoader } from 'meishewasmloader';
```
2. Configure the Meishe AR Face Filter environment
Meishe AR Face Filter requires the following response headers to be configured so that SharedArrayBuffer is available.
2.1 Vite environment
Add the following entry to the plugins array in vite.config.ts:
```typescript
plugins: [
  {
    name: 'configure-response-headers',
    configureServer: (server) => {
      server.middlewares.use((_req, res, next) => {
        res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
        res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
        res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
        next()
      })
    },
    configurePreviewServer: (server) => {
      server.middlewares.use((_req, res, next) => {
        res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
        res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
        res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
        next()
      })
    },
  },
  ...
]
```
2.2 webpack environment
Add the following devServer configuration to webpack.config.ts:

```typescript
devServer: {
  headers: {
    'Cross-Origin-Opener-Policy': 'same-origin',
    'Cross-Origin-Embedder-Policy': 'require-corp',
    'Cross-Origin-Resource-Policy': 'cross-origin'
  },
  ...
}
```
These settings add the above headers to responses, which instructs the browser to enable SharedArrayBuffer. The exact syntax depends on your webpack or Vite version; configure the equivalent option for the version you use.
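A quick sanity check for these headers: SharedArrayBuffer only becomes available when the page is cross-origin isolated, which the browser reports via `self.crossOriginIsolated`. The helper below is an illustrative utility (not part of either SDK) that validates a header map the same way the browser does, e.g. in a server-side test:

```typescript
// Returns true when the COOP/COEP response headers are set so that the
// browser will cross-origin isolate the page and expose SharedArrayBuffer.
function headersEnableSharedArrayBuffer(headers: Record<string, string>): boolean {
  const get = (name: string) =>
    (headers[name] ?? headers[name.toLowerCase()] ?? '').trim();
  return (
    get('Cross-Origin-Opener-Policy') === 'same-origin' &&
    get('Cross-Origin-Embedder-Policy') === 'require-corp'
  );
}

// In the page itself, the authoritative runtime signal is simply:
//   if (!self.crossOriginIsolated) { /* headers are missing or wrong */ }
```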
3. Initialize Meishe AR Face Filter
Refer to the following steps to initialize Meishe AR Face Filter:
Import the API file NvEffectSdk.js:

```html
<script src="xxx/meisheSDK-CDN-Url"></script>
```

WASMLoader is used to load the WASM-related files:

```typescript
const wasmLoader = WASMLoader({
  // Loading progress callback
  showLoader: function (state) {},
  // Failure callback
  showError(errorText: string) {},
  // Success callback
  loadingFinished() {},
});
wasmLoader.loadEmscriptenModule("xxx/meisheSDK-CDN-Url", {
  effectSdk: true
});
// The addresses in steps 1 and 2 must be consistent; otherwise unexpected results may occur.
```
Authorization verification
To facilitate access testing, the Meishe web Effect SDK allows all effects to be used without authorization verification when the page is served from localhost. When deploying in a non-localhost environment, contact Meishe business personnel to obtain the SDK authorization file, then perform authorization verification as follows: in the WASM load success callback loadingFinished, call verifySdkLicenseFileUrl():

```typescript
if (nveGetEffectContextInstance().verifySdkLicenseFileUrl('xxx.lic')) {
  // Authorization succeeded
} else {
  // Authorization failed
}
```
4. Initialize the ZEGO engine
Create the ZegoExpressEngine instance. Refer to "Project Information" in Console - Project Management for the appId and server values.

```typescript
const zg = new ZegoExpressEngine(appId, server);
```
5. Initialize the ARScene shortcut renderer
Initializing the ARScene shortcut renderer loads the SDK together with a series of model and data packages, enabling recognition and loading of beauty, makeup, face shaping, avatar, portrait, sticker, caption, background segmentation, special-effect prop, and other features.
The configuration is as follows:
```typescript
import jszip from 'jszip';

let arSceneRenderer = new NveARSceneRenderer();
// arSceneRenderer is the Meishe AR Face Filter instance and needs to be saved
arSceneRenderer.init({
  faceModelUrl: 'https://xxx',
  eyecontourModelUrl: 'https://xxx',
  avatarModelUrl: 'https://xxx',
  segmentationModelUrl: 'https://xxx',
  makeupDataUrl: 'https://xxx',
  fakefaceDataUrl: 'https://xxx',
  faceCommonDataUrl: 'https://xxx',
  advancedBeautyDataUrl: 'https://xxx',
  detectionMode: 32768 | 32, // sdkFlag
  ratio: width + ':' + height,
  mirror: true,
  jszip,
})
```
The arSceneRenderer returned after initializing the Meishe AR Face Filter plugin is of type NveARSceneRenderer; its properties and methods can be found in the type documentation.
6. Enable Meishe AR Face Filter plugin effects
Follow these steps to enable effects:
Call the browser's navigator.mediaDevices.getUserMedia method to open the camera and obtain the raw stream:

```typescript
const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
```

Call the SDK's pushMediaStream method to send the stream to the SDK:

```typescript
arSceneRenderer.pushMediaStream(stream);
```

Call the SDK's getOutputStream method to get the processed stream:

```typescript
const sdkStream = arSceneRenderer.getOutputStream();
```

Call the shortcut renderer's enableBeauty method to enable basic beauty:

```typescript
arSceneRenderer.enableBeauty(true);
```
7. Connect the ZEGO SDK
Use the processed video stream obtained from the Meishe SDK in step 6 as a third-party stream, and call createZegoStream to create a ZEGO video stream:

```typescript
const zegoStream = await zg.createZegoStream({
  custom: {
    video: {
      source: sdkStream as MediaStream,
    }
  }
});
```

Log in to the room by calling the ZEGO API loginRoom. The related process and parameters are described in the audio and video call documentation on the ZEGO website:

```typescript
zg.loginRoom(
  roomid,
  token,
  { userID: userid, userName: nickName },
  { userUpdate: true },
);
```

Use startPublishingStream to publish the stream:

```typescript
let streamID = new Date().getTime().toString();
zg.startPublishingStream(streamID, zegoStream, { videoCodec: 'H264' });
```
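A timestamp-only stream ID can collide when two clients start publishing in the same millisecond. Below is a slightly more collision-resistant sketch (illustrative only; stream IDs should generally be limited to digits, letters, '-' and '_'):

```typescript
// Generates a stream ID from a timestamp plus a random suffix, reducing
// the chance of collisions between clients publishing at the same time.
function generateStreamId(prefix: string = 'stream'): string {
  const ts = Date.now().toString(36);
  const rand = Math.random().toString(36).slice(2, 8);
  return `${prefix}-${ts}-${rand}`;
}
```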
Preview the published video locally:

```typescript
zegoStream.playCaptureVideo(document.getElementById('local-preview'), {
  objectFit: 'cover',
});
```
Sample code
The following code implements the plugin features for reference.
vite.config.ts configuration:

```typescript
export default defineConfig({
  plugins: [
    react(),
    {
      name: 'configure-response-headers',
      configureServer: (server) => {
        server.middlewares.use((_req, res, next) => {
          res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
          res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
          res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
          next()
        })
      },
      configurePreviewServer: (server) => {
        server.middlewares.use((_req, res, next) => {
          res.setHeader('Cross-Origin-Embedder-Policy', 'require-corp')
          res.setHeader('Cross-Origin-Opener-Policy', 'same-origin')
          res.setHeader('Cross-Origin-Resource-Policy', 'cross-origin')
          next()
        })
      },
    },
  ],
})
```
Functional code:

```typescript
import jszip from 'jszip';
import { WASMLoader } from 'meishewasmloader';
import { ZegoExpressEngine } from 'zego-express-engine-webrtc';

const wasmLoader = WASMLoader({
  // Loading progress callback
  showLoader: function (state) {},
  // Failure callback
  showError(errorText: string) {},
  // Success callback
  loadingFinished() {},
});
wasmLoader.loadEmscriptenModule("xxx/meishe-SDK-CDN-url", {
  effectSdk: true
});

let arSceneRenderer = new NveARSceneRenderer();
await arSceneRenderer.init({
  faceModelUrl: 'https://xxx',
  eyecontourModelUrl: 'https://xxx',
  avatarModelUrl: 'https://xxx',
  segmentationModelUrl: 'https://xxx',
  makeupDataUrl: 'https://xxx',
  fakefaceDataUrl: 'https://xxx',
  faceCommonDataUrl: 'https://xxx',
  advancedBeautyDataUrl: 'https://xxx',
  detectionMode: 32768 | 32,
  ratio: width + ':' + height,
  sdkCDNUrl: 'https://xxx',
  licFileUrl: 'xxx.lic',
  mirror: true,
  jszip
})

// Initialize the ZEGO engine
const zg = new ZegoExpressEngine(appId, server);
// Call the browser API to start the camera and get the audio and video stream
const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
// Pass it to the Meishe SDK
arSceneRenderer.pushMediaStream(stream);
// Obtain the processed audio and video stream for use as a third-party stream
let sdkStream = arSceneRenderer.getOutputStream();
// Enable the basic beauty function
arSceneRenderer.enableBeauty(true);
// Call createZegoStream to create a ZEGO video stream
const zegoStream = await zg.createZegoStream({
  custom: {
    video: {
      source: sdkStream as MediaStream,
    }
  },
});
// Call loginRoom to log in to the room
await zg.loginRoom(
  roomid,
  token,
  { userID: userid, userName: nickName },
  { userUpdate: true },
);
// Publish stream ID, needs to be unique
let streamID = new Date().getTime().toString();
// Call startPublishingStream to publish the stream
zg.startPublishingStream(streamID, zegoStream, { videoCodec: 'H264' });
// Local preview
zegoStream.playCaptureVideo(document.getElementById('local-preview'), {
  objectFit: 'cover',
});
// Set the beauty effects
const effectList = [
  {
    key: "Advanced Beauty Intensity",
    intensity: 1,
  },
  {
    key: "Beauty Whitening",
    intensity: 1,
  },
];
// Add makeup
effectList.push({
  url: "https://xxx.makeup",
  licUrl: "xxxx",
  intensity: 1,
})
// Add filters
effectList.push({
  url: "https://xxxx",
  licUrl: "xxxx",
  intensity: 1,
})
// Apply filters and effects
arSceneRenderer.setEffectList(effectList);
```
API reference
Meishe AR Face Filter API
NveARSceneRenderer
new NveARSceneRenderer();
Creates the SDK shortcut renderer instance.
init
arSceneRenderer.init()
Initializes the SDK shortcut renderer and returns the instance. This method is on the NveARSceneRenderer class.
enableBeauty
enableBeauty()
Turns beauty effects on or off; controls only the built-in beauty effects. This method is on the NveARSceneRenderer plugin instance.
setEffectList
setEffectList()
Sets the list of effects; you can set beauty, makeup, face shaping, filters, props, and backgrounds. This method is on the NveARSceneRenderer plugin instance.
getEffectList
getEffectList()
Gets the list of all effects that have been set, including added beauty, filters, props, and more. This method is on the NveARSceneRenderer plugin instance.
createExternalEffectInstance
createExternalEffectInstance()
Creates an extended effect instance, used to create caption and sticker instances. This method is on the NveARSceneRenderer plugin instance.
appendExternalEffectInstance
appendExternalEffectInstance()
Adds an extended effect instance to the plugin instance. This method is on the NveARSceneRenderer plugin instance.
setExternalEffectInstanceList
setExternalEffectInstanceList()
Sets (resets) the list of added extended effects. This method is on the NveARSceneRenderer plugin instance.
Special effect settings
Beauty
Built-in effects
Built-in effects do not require an effects package; they are part of the beauty effects plugin itself, but you need to know the key name. The specific keys can be found in the Meishe SDK web documentation; roughly as follows:

```typescript
// key is the built-in effect keyword used to distinguish effect functions
// intensity is the specific strength; the value range is [-1, 1] or [0, 1] depending on the effect
// The SDK has hundreds of built-in effects; their ranges and key values can be looked up in the documentation
const beautyArray = [
  { key: 'Advanced Beauty Intensity', intensity: 1 },
  { key: 'Beauty Whitening', intensity: 1 },
  { key: 'Beauty Reddening', intensity: 1 },
  { key: 'Face Mesh Face Width Degree', intensity: 1 },
  { key: 'Face Mesh Face Length Degree', intensity: 1 },
  { key: 'Face Mesh Face Size Degree', intensity: 1 },
  { key: 'Face Mesh Forehead Height Degree', intensity: 1 },
  ...
]
arSceneRenderer.setEffectList(beautyArray)
```
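Since built-in effects are plain `{ key, intensity }` pairs, a small helper (hypothetical, not part of the SDK) can build the array from a key-to-intensity map while clamping each value into a given range:

```typescript
interface BuiltinEffect {
  key: string;
  intensity: number;
}

// Builds a setEffectList-compatible array from a key→intensity map,
// clamping each intensity into [min, max] (default [0, 1]).
function buildBeautyList(
  intensities: Record<string, number>,
  min: number = 0,
  max: number = 1,
): BuiltinEffect[] {
  return Object.entries(intensities).map(([key, value]) => ({
    key,
    intensity: Math.min(max, Math.max(min, value)),
  }));
}
```

For keys whose documented range is [-1, 1], pass min = -1.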
Beauty template
Unlike built-in effects, beauty templates require a template package to be passed in, rather than a key value. The code is as follows:

```typescript
const templateArray = [
  {
    url: "https://xxxx",
    licUrl: "xxxx",
    intensity: 1,
  },
  {
    url: "https://xxxx",
    licUrl: "xxxx",
    intensity: 1,
  },
]
arSceneRenderer.setEffectList(templateArray)
```
Makeup
Normal package
Ordinary makeup packages cover lipstick, eyeshadow, eyebrows, eyelashes, eyeliner, blush, highlight, shadow, contact lenses, and a series of other categories. The configuration method is the same as for beauty templates; the code is as follows:

```typescript
const makeupArray = [
  {
    url: "https://xxx.makeup",
    licUrl: "xxxx",
    intensity: 1,
  },
  {
    url: "https://xxx.makeup",
    licUrl: "xxxx",
    intensity: 1,
  },
]
arSceneRenderer.setEffectList(makeupArray)
```
Complete package
A complete package bundles a series of ordinary packages. The configuration method is as follows:

```typescript
arSceneRenderer.setEffectList([
  {
    url: "https://xxx.zip",
  },
])
```
Filter
Filters are configured in the same way as makeup packages; the code is as follows:

```typescript
const filterArray = [
  {
    url: "https://xxxx",
    licUrl: "xxxx",
    intensity: 1,
  }
]
arSceneRenderer.setEffectList(filterArray)
```
Virtual background
Background blur
You need to specify the background blur effects package, as follows:

```typescript
arSceneRenderer.setEffectList([
  {
    url: "https://xxx.videofx",
    licUrl: ''
  }
])
```
Background substitution
Background replacement is configured as follows:

```typescript
arSceneRenderer.setEffectList([
  { segmentationBackgroundUrl: "https://xxx.png" } // image url
])
```
AR scene
The configuration method is the same as for filters; the code is as follows:

```typescript
arSceneRenderer.setEffectList([
  {
    url: "https://xxx.arscene",
    licUrl: ''
  }
])
```
Caption
1 Common captions
Common caption categories include word-art (fancy text), animated captions, word-by-word captions, bubble captions, and so on. The configuration method differs from that of makeup.
The createExternalEffectInstance method parameters are listed below:
Parameters | Type | Required | Default | Explanation |
---|---|---|---|---|
modular | Boolean | No | false | Whether this is a module caption; module captions can add effects such as bubbles and animations. |
text | String | Yes | - | Caption text. |
inPoint | Number | Yes | - | Start time, in microseconds. |
duration | Number | Yes | - | Duration, in microseconds; set to Number.MAX_SAFE_INTEGER for continuous display. |
url | String | No | - | Common caption package address. |
licUrl | String | No | - | License file address for the common caption package. |
fontFileUrl | String | No | - | Font file address; if not set, non-English text will display incorrectly because the corresponding font cannot be found. |
captionRendererUrl | String | No | - | Word-art effect style package address. |
captionRendererLicUrl | String | No | - | License file address for the word-art effect style package. |
captionContextUrl | String | No | - | Bubble effect style package address. |
captionContextLicUrl | String | No | - | License file address for the bubble effect style package. |
captionAnimationUrl | String | No | - | Animation effect style package address. |
captionAnimationLicUrl | String | No | - | License file address for the animation effect style package. |
animationPeriod | Number | No | - | Animation effect period, in milliseconds. |
captionInAnimationUrl | String | No | - | In-animation style package address. |
captionInAnimationLicUrl | String | No | - | License file address for the in-animation style package. |
inAnimationDuration | Number | No | - | Duration of the in animation, in milliseconds. |
captionOutAnimationUrl | String | No | - | Out-animation style package address. |
captionOutAnimationLicUrl | String | No | - | License file address for the out-animation style package. |
outAnimationDuration | Number | No | - | Duration of the out animation, in milliseconds. |
positionX | Number | No | - | Horizontal position, normalized value in [-1, 1]; center is 0, right is the positive direction. |
positionY | Number | No | - | Vertical position, normalized value in [-1, 1]; center is 0, up is the positive direction. |
position | String | No | - | Named position; possible values are top-left, top, top-right, left, center, right, bottom-left, bottom, and bottom-right. |
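Note the mixed units in the table: inPoint and duration are in microseconds, while the animation durations and periods are in milliseconds. A pair of tiny conversion helpers (illustrative only, not part of the SDK) keeps that explicit:

```typescript
const MICROSECONDS_PER_SECOND = 1_000_000;
const MILLISECONDS_PER_SECOND = 1_000;

// Seconds → microseconds, for inPoint / duration.
function secondsToMicroseconds(seconds: number): number {
  return Math.round(seconds * MICROSECONDS_PER_SECOND);
}

// Seconds → milliseconds, for animationPeriod / inAnimationDuration / outAnimationDuration.
function secondsToMilliseconds(seconds: number): number {
  return Math.round(seconds * MILLISECONDS_PER_SECOND);
}
```

For example, the value 5000000 used throughout the examples below is secondsToMicroseconds(5).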
add
Captions are created with createExternalEffectInstance and added with appendExternalEffectInstance. In the following examples, the various caption effects are created with the same API and different parameters; refer to the parameter table above for details.
```typescript
// Common caption
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
  }
);
// Create a common caption with a caption style
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.captionstyle",
    licUrl: "",
  }
);
// Create a common caption with a caption style and a word-art effect
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.captionstyle",
    licUrl: "",
    captionRendererUrl: "https://xxx.captionrenderer",
    captionRendererLicUrl: "",
  }
);
// Create a common caption with position information
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
    positionX: -0.6,
    positionY: 0.9,
  }
);
// Create a common caption with a named position
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
    position: "top-right",
  }
);
// Create a common caption with a font
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    text: "caption",
    inPoint: 0,
    duration: 5000000,
    fontFileUrl: "https://xxx.ttf",
  }
);
// Create an unstyled module caption
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    modular: true,
    text: "modularCaption",
    inPoint: 0,
    duration: 5000000,
  }
);
// Create a module caption with a word-art effect
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    modular: true,
    text: "modularCaption",
    inPoint: 0,
    duration: 5000000,
    captionRendererUrl: "https://xxx.captionrenderer",
    captionRendererLicUrl: "",
  }
);
// Create a module caption with a bubble effect
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    modular: true,
    text: "modularCaption",
    inPoint: 0,
    duration: 5000000,
    captionContextUrl: "https://xxx.captioncontext",
    captionContextLicUrl: "",
  }
);
// Create a module caption with an animation effect
const caption = await arSceneRenderer.createExternalEffectInstance(
  {
    modular: true,
    text: "modularCaption",
    inPoint: 0,
    duration: 5000000,
    captionAnimationUrl: "https://xxx.captionanimation",
    captionAnimationLicUrl: "",
    animationPeriod: 2500,
  }
);
// Use the caption instance's interface to modify other caption attributes
caption.scaleCaption2(0.8);
caption.setUnderline(true);
// Add the caption instance
arSceneRenderer.appendExternalEffectInstance(caption);
```
The common caption instance type is NveCaption, which provides various attributes and methods; refer to the linked documentation.
get
To obtain the list of all currently set extension effects, use the getExternalEffectInstanceList method. The code is as follows:

```typescript
// Get the list of all extended effects. Note that extended effects are not only captions;
// there may be other effects, so filtering is needed
// You can also maintain your own list of already-set caption instances
let instanceList = arSceneRenderer.getExternalEffectInstanceList();
let captionInstanceList = instanceList.filter(instance => instance instanceof NveCaption)
```
delete
Get the list via getExternalEffectInstanceList, find the item whose id matches the caption instance to delete, remove it, and reset the list via setExternalEffectInstanceList. The code is as follows:

```typescript
let deleteInstanceId = 'xxxx' // The id of the caption instance to remove
let instanceList = arSceneRenderer.getExternalEffectInstanceList() // Get the list of extended effects; no filtering needed for deletion
let index = instanceList.findIndex(instance => instance.id === deleteInstanceId) // Get the index of the item to delete
if (index !== -1) {
  instanceList[index].release() // Release resources
  instanceList.splice(index, 1)
  arSceneRenderer.setExternalEffectInstanceList(instanceList); // Reset the list of extended effects
}
```
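The delete logic can be factored into a small generic helper (hypothetical, not part of the SDK); its result would then be passed to setExternalEffectInstanceList:

```typescript
// Minimal shape of an extended effect instance, as used by the delete flow.
interface EffectInstanceLike {
  id: string;
  release(): void;
}

// Removes the instance with the given id (releasing its resources) and
// returns a new list; returns the original list unchanged when no match.
function removeInstanceById<T extends EffectInstanceLike>(list: T[], id: string): T[] {
  const index = list.findIndex((instance) => instance.id === id);
  if (index === -1) return list;
  list[index].release();
  return [...list.slice(0, index), ...list.slice(index + 1)];
}
```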
2 Compound captions
For the createExternalEffectInstance parameters, refer to the common caption parameter table above.
The code is as follows:
```typescript
// Create a compound caption
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.compoundcaption",
    licUrl: "",
  }
);
// Create a compound caption with position information
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.compoundcaption",
    licUrl: "",
    positionX: -0.6,
    positionY: 0.9,
  }
);
// Create a compound caption with a named position
const compoundCaption = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.compoundcaption",
    licUrl: "",
    position: "top-right",
  }
);
// Modify other properties using the compound caption instance's interface
compoundCaption.scaleCaption2(0.8);
compoundCaption.setText(0, "caption0");
// Add the compound caption instance
arSceneRenderer.appendExternalEffectInstance(compoundCaption);
```
The compound caption instance type is NveCompoundCaption, which can be configured in the same way as common captions.
Animated Sticker
Stickers are created in a similar way to captions, but the parameters differ somewhat. The parameter table is as follows:
Parameters | Type | Required | Default | Explanation |
---|---|---|---|---|
inPoint | Number | Yes | - | Start time of the animated sticker, in microseconds. |
duration | Number | Yes | - | Duration of the animated sticker, in microseconds; set to Number.MAX_SAFE_INTEGER for continuous display. |
url | String | No | - | Animated sticker package address. |
licUrl | String | No | - | License file address for the animated sticker package. |
animatedStickerAnimationUrl | String | No | - | Animation style package address. |
animatedStickerAnimationLicUrl | String | No | - | License file address for the animation style package. |
animationPeriod | Number | No | - | Animation effect period, in milliseconds. |
animatedStickerInAnimationUrl | String | No | - | In-animation style package address. |
animatedStickerInAnimationLicUrl | String | No | - | License file address for the in-animation style package. |
inAnimationDuration | Number | No | - | Duration of the in animation, in milliseconds. |
animatedStickerOutAnimationUrl | String | No | - | Out-animation style package address. |
animatedStickerOutAnimationLicUrl | String | No | - | License file address for the out-animation style package. |
outAnimationDuration | Number | No | - | Duration of the out animation, in milliseconds. |
positionX | Number | No | - | Horizontal position, normalized value in [-1, 1]; center is 0, right is the positive direction. |
positionY | Number | No | - | Vertical position, normalized value in [-1, 1]; center is 0, up is the positive direction. |
position | String | No | - | Named position; possible values are top-left, top, top-right, left, center, right, bottom-left, bottom, and bottom-right. |
The code is as follows:

```typescript
// Create a sticker without animation
const sticker = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.animatedsticker",
    licUrl: "",
  }
);
// Create an animated sticker
const sticker = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.animatedsticker",
    licUrl: "",
    animatedStickerAnimationUrl: "https://xxx.animatedstickeranimation",
    animatedStickerAnimationLicUrl: "",
    animationPeriod: 5000,
  }
);
// Create a sticker with position information
const sticker = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.animatedsticker",
    licUrl: "",
    positionX: -0.6,
    positionY: 0.9,
  }
);
// Create a sticker with a named position
const sticker = await arSceneRenderer.createExternalEffectInstance(
  {
    inPoint: 0,
    duration: 5000000,
    url: "https://xxx.animatedsticker",
    licUrl: "",
    position: "top-right",
  }
);
// Modify other properties using the animated sticker instance's interface
sticker.scaleAnimatedSticker2(0.5);
// Add the animated sticker instance
arSceneRenderer.appendExternalEffectInstance(sticker);
```
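The named position values from the parameter tables correspond to the normalized positionX/positionY coordinates. One plausible mapping (illustrative only; the SDK's actual placement may apply its own margins) is:

```typescript
type NamedPosition =
  | 'top-left' | 'top' | 'top-right'
  | 'left' | 'center' | 'right'
  | 'bottom-left' | 'bottom' | 'bottom-right';

// Maps a named position to normalized coordinates in [-1, 1]
// (x positive to the right, y positive upward), corners at ±1.
function namedPositionToXY(position: NamedPosition): { x: number; y: number } {
  const x = position.endsWith('left') ? -1 : position.endsWith('right') ? 1 : 0;
  const y = position.startsWith('top') ? 1 : position.startsWith('bottom') ? -1 : 0;
  return { x, y };
}
```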
The sticker instance type is NveAnimatedSticker; it can be operated and configured in the same way as captions.