MeiCam SDK For iOS  3.10.0
Question list
  1. How to generate a square video?
  2. How to generate a video with skin beautification effect?
  3. How to generate a single video file from multiple materials?
  4. How to realize picture-in-picture effect?
  5. How to add a watermark?
  6. Going from the recording interface to the playback interface works, but after returning, the recording interface shows a black screen.
  7. NvsColor setting is invalid
  8. From the recording interface to the playback interface, the live window of the playback interface flashes black and then returns to normal
  9. Why is the screen orientation of videos recorded on some phones abnormal?
  10. How to fix code obfuscation errors when using the Meishe SDK on Android?
  11. How to use H265 for video shooting and generation?
  12. What are the camera capture resolutions for each grade?
  13. What are the compile bitrates for each grade?

Detailed explanation

1.How to generate a square video?

Assume the video was shot vertically at a resolution of 1280*720 and the user wants to generate a 720*720 video.
1)Create the timeline.

NvsVideoResolution videoEditRes;
videoEditRes.imageWidth = 720;
videoEditRes.imageHeight = 720;
videoEditRes.imagePAR = (NvsRational){1, 1};
NvsRational videoFps = {25, 1};
NvsAudioResolution audioEditRes;
audioEditRes.sampleRate = 48000;
audioEditRes.channelCount = 2;
audioEditRes.sampleFormat = NvsAudSmpFmt_S16;
// Create the timeline.
m_timeline = [streamingContext createTimeline:&videoEditRes videoFps:&videoFps audioEditRes:&audioEditRes];

2)Create tracks and clips. path is the absolute path of the clip.

NvsVideoTrack *videoTrack = [m_timeline appendVideoTrack];
NvsVideoClip *clip = [videoTrack appendClip:path];

3)Zoom in on the video.

[clip setPan:0 andScan:1];

For detailed settings, please refer to Pan and Scan.
4)Generate the video. path is the output file path.

[m_streamingContext compileTimeline:m_timeline startTime:0 endTime:m_timeline.duration outputFilePath:path videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];

2.How to generate a video with skin beautification effect?

1)Create the timeline, track, and clip. This part is the same as in question 1.
2)Add the beauty effect.

[clip appendBeautyFx];

3)Generate the video. This is the same compile step as in question 1; a complete sketch follows.
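For reference, a minimal end-to-end sketch might look as follows, assuming m_streamingContext and m_timeline have been created as in question 1 and that path and outputPath are valid file paths:

// Create a track, append the clip, and add the beauty effect.
NvsVideoTrack *videoTrack = [m_timeline appendVideoTrack];
NvsVideoClip *clip = [videoTrack appendClip:path];
[clip appendBeautyFx];

// Compile the timeline to a 720p file, as in question 1.
[m_streamingContext compileTimeline:m_timeline startTime:0 endTime:m_timeline.duration outputFilePath:outputPath videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];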

3.How to generate a single video file from multiple materials?

1)When creating tracks and clips, append multiple materials to create multiple clips.

NvsVideoTrack *videoTrack = [m_timeline appendVideoTrack];
NvsVideoClip *clip1 = [videoTrack appendClip:path1];
NvsVideoClip *clip2 = [videoTrack appendClip:path2];
NvsVideoClip *clip3 = [videoTrack appendClip:path3];
NvsVideoClip *clip4 = [videoTrack appendClip:path4];
NvsVideoClip *clip5 = [videoTrack appendClip:path5];

2)Generate the video.

[m_streamingContext compileTimeline:m_timeline startTime:0 endTime:m_timeline.duration outputFilePath:path videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];

In this way, the multiple materials are compiled into a single video file.

4.How to realize picture-in-picture effect?

A simple picture-in-picture effect is the superimposition of two images (videos) with different resolutions, such as a horizontally-shot video and a vertically-shot video, added to two separate tracks. In addition, the Transform 2D effect can scale, rotate, and adjust the transparency of the upper video, as sketched below.

NvsVideoTrack *videoTrack1 = [m_timeline appendVideoTrack];
NvsVideoTrack *videoTrack2 = [m_timeline appendVideoTrack];
NvsVideoClip *clip1 = [videoTrack1 appendClip:path1];
NvsVideoClip *clip2 = [videoTrack2 appendClip:path2];
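To scale, rotate, or fade the upper clip, a Transform 2D effect can be appended to it. A minimal sketch follows; the parameter names ("Scale X", "Scale Y", "Rotation", "Opacity") are assumptions here and should be verified against the SDK's built-in effect documentation:

// Append the built-in "Transform 2D" effect to the clip on the upper track.
NvsVideoFx *transformFx = [clip2 appendBuiltinFx:@"Transform 2D"];
if (transformFx) {
    // Shrink the upper video to half size so the lower video remains visible.
    [transformFx setFloatVal:0.5 forName:@"Scale X"];
    [transformFx setFloatVal:0.5 forName:@"Scale Y"];
    // Optional rotation (in degrees) and transparency.
    [transformFx setFloatVal:15 forName:@"Rotation"];
    [transformFx setFloatVal:0.8 forName:@"Opacity"];
}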

5.How to add a watermark?

There are two ways to add a watermark. The first uses the sticker function: the user sends the watermark image to Meishe, which builds it into a watermark package. The finished file is named with a UUID and has the .animatedsticker extension. With this file, the watermark can be added through the API.

NSMutableString *m_stickerId = [NSMutableString string];
NSString *packagePath = [appPath stringByAppendingPathComponent:@"89740AEA-80D6-432A-B6DE-E7F6539C4121.animatedsticker"];
NvsAssetPackageManagerError error = [m_streamingContext.assetPackageManager installAssetPackage:packagePath license:nil type:NvsAssetPackageType_AnimatedSticker sync:YES assetPackageId:m_stickerId];
if (error != NvsAssetPackageManagerError_NoError && error != NvsAssetPackageManagerError_AlreadyInstalled) {
    NSLog(@"Failed to install the animated sticker package!");
    package1Valid = false;
}

[m_timeline addAnimatedSticker:0 duration:m_timeline.duration animatedStickerPackageId:m_stickerId];

The second way is to invoke the addWatermark interface of the NvsTimeline class.

[m_timeline addWatermark:path displayWidth:0 displayHeight:0 opacity:1 position:NvsTimelineWatermarkPosition_TopRight marginX:0 marginY:0]; // path is the path of the watermark file, which must be in PNG or JPG format.

6.Going from the recording interface to the playback interface works, but after returning, the recording interface shows a black screen.

Check whether the connectCapturePreviewWithLiveWindow interface of the NvsStreamingContext class has been called correctly, or whether stop was called on the NvsStreamingContext after startCapturePreview. Similarly, a black screen when going from the recording interface to the playback interface may be caused by calling stop on the NvsStreamingContext after playbackTimeline, or by the connectTimelineWithLiveWindow method of the NvsStreamingContext not being called or being called abnormally.
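For example, when the recording interface becomes visible again (e.g., in viewWillAppear:), the capture preview can be re-connected and restarted. This is only a sketch: m_liveWindow is assumed to be the live window of the recording interface, and the resolution grade is just an example.

// Re-connect the live window and restart the capture preview.
[m_streamingContext connectCapturePreviewWithLiveWindow:m_liveWindow];
[m_streamingContext startCapturePreview:0 videoResolutionGrade:NvsVideoCaptureResolutionGradeHigh flags:0 aspectRatio:nil];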

7.NvsColor setting is invalid

The fields of NvsColor are floats, and r, g, b, and a take values from 0 to 1. If the given color values are 100, 100, 100, each must be divided by 255.
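For example, a gray color of (100, 100, 100) with full opacity could be built as follows (assuming the NvsColor fields are r, g, b, a in that order):

// Normalize 0-255 components to the 0-1 range expected by NvsColor.
NvsColor color = {100 / 255.0f, 100 / 255.0f, 100 / 255.0f, 1.0f};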

8.From the recording interface to the playback interface, the live window of the playback interface flashes black and then returns to normal

Calling playbackTimeline to start playback needs a moment before the preview appears, which causes a brief black flash. To avoid this, first call the seekTimeline interface to seek to position 0 and then start playback; the black flash will no longer occur.
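A minimal sketch, assuming m_timeline is already connected to the live window; the exact seekTimeline/playbackTimeline parameter lists and the NvsVideoPreviewSizeMode value should be checked against the SDK headers:

// Seek to position 0 so the first frame is rendered immediately,
// then start playback; this avoids the brief black flash.
[m_streamingContext seekTimeline:m_timeline timestamp:0 videoSizeMode:NvsVideoPreviewSizeModeLiveWindowSize flags:0];
[m_streamingContext playbackTimeline:m_timeline startTime:0 endTime:m_timeline.duration videoSizeMode:NvsVideoPreviewSizeModeLiveWindowSize preload:YES flags:0];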

9.Why is the screen orientation of videos recorded on some phones abnormal?

The reason may be that the players on some phones do not support automatic rotation, so the image orientation appears abnormal during video playback, which can mislead users.

10.How to fix code obfuscation errors when using the Meishe SDK on Android?

When using code obfuscation, take care not to obfuscate the following classes. Add the following keep rules to avoid this error:

-keep class com.cdv.**  {*;}
-keep class com.meicam.**  {*;}

When using the effect SDK (effectsdk) alone, take care not to obfuscate the following classes. Add the following keep rules to avoid this error:

-keep class com.cdv.effect.**  {*;}
-keep class com.meicam.effect.**  {*;}

11.How to use H265 for video shooting and generation?

The use of H265 for video shooting is as follows:

NSMutableDictionary *config = [[NSMutableDictionary alloc] init];
[config setValue:@"hevc" forKey:NVS_COMPILE_VIDEO_ENCODEC_NAME];
[context startRecording:filePath withFlags:0 withRecordConfigurations:config];

The use of H265 for video generation is as follows:

NSMutableDictionary *config = [[NSMutableDictionary alloc] init];
[config setValue:@"hevc" forKey:NVS_COMPILE_VIDEO_ENCODEC_NAME]; // H265 (HEVC) mode
context.compileConfigurations = config; // Set before compileTimeline is invoked
[context compileTimeline:timeline startTime:0 endTime:timeline.duration outputFilePath:outputPath videoResolutionGrade:NvsCompileVideoResolutionGrade720 videoBitrateGrade:NvsCompileBitrateGradeHigh flags:0];

Warning
Not all iPhone models support H265 shooting and generation. If a device does not, use the default settings for video shooting and generation.

12.What are the camera capture resolutions for each grade?


Attention: The SDK evaluates the processing capability of the user's phone. If the phone is capable of processing the recorded video, the video is recorded at the resolution grade that was set. If not, the SDK lowers the resolution grade to a level the phone can handle. For example, if a certain phone model is set to the SUPER_HIGH grade but cannot support it, the SDK lowers the grade to HIGH or MEDIUM, so the recorded grade differs from the grade the user set. Likewise, when recording without special effects (using the system's built-in camera), the resolution grade is determined by the camera's capability; if the camera cannot satisfy the configured grade, the SDK lowers the resolution grade when recording.

13.What are the compile bitrates for each grade?