# Project file image path replacement

When synthesizing a project file, the synthesizer does not support images at network paths; only video and audio resources may use network paths. Therefore, when synthesizing a project that contains network images, first download each image from its network path to a local path, then replace the image's network path in the project file with that local path. (Note: only replace the `path` attribute in the project file's `resource` tag.)

XML before replacement:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<timeline>
    <resources>
        <resource id="1" path="http://***.mp4"/>
        <resource id="2" path="http://***.png"/>
        ...
    </resources>
    ...
</timeline>
```

XML after replacement:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<timeline>
    <resources>
        <resource id="1" path="http://***.mp4"/>
        <resource id="2" path="/opt/***.png"/>
        ...
    </resources>
    ...
</timeline>
```
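The replacement above can be sketched as follows. This is a minimal sketch, assuming the project file is well-formed XML and that downloads go under `/opt`; the function names and the image-extension list are illustrative, not part of any real API.

```python
import os
import shutil
import urllib.request
import xml.etree.ElementTree as ET

IMAGE_EXTS = (".png", ".jpg", ".jpeg", ".gif", ".bmp")

def _download(url: str, dest: str) -> None:
    # Fetch the remote file and write it to dest.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out)

def localize_images(xml_text: str, download_dir: str = "/opt", fetch=_download) -> str:
    """Download every network image referenced by a <resource> tag and
    rewrite only its path attribute to point at the local copy."""
    root = ET.fromstring(xml_text)
    for resource in root.iter("resource"):
        path = resource.get("path", "")
        if path.startswith("http") and path.lower().endswith(IMAGE_EXTS):
            local_path = os.path.join(download_dir, os.path.basename(path))
            fetch(path, local_path)
            # Only the path attribute changes; everything else is untouched.
            resource.set("path", local_path)
    return ET.tostring(root, encoding="unicode")
```

Video and audio resources keep their network paths, so only entries whose extension marks them as images are rewritten.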

# Replacing the m3u8 path in a live cloud-clipping project file

For synthesizing a live cloud-clipping project, first download the live-stream m3u8 resource referenced in the XML to a local path, then replace the corresponding network path in the project file with that local path. (Note: only replace the `path` attribute in the project file's `resource` tag.)
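The rewriting step mirrors the image case above. A minimal sketch, with illustrative names; note that an HLS m3u8 playlist also references its media segments by URI, so those would need to be fetched as well for the local playlist to be playable (omitted here):

```python
import os
import shutil
import urllib.request
import xml.etree.ElementTree as ET

def _download(url: str, dest: str) -> None:
    # Fetch the remote playlist and write it to dest. The segment files
    # listed inside the playlist must be downloaded separately.
    with urllib.request.urlopen(url) as resp, open(dest, "wb") as out:
        shutil.copyfileobj(resp, out)

def localize_m3u8(xml_text: str, download_dir: str = "/opt", fetch=_download) -> str:
    """Download every network m3u8 referenced by a <resource> tag and
    rewrite only its path attribute to the local copy."""
    root = ET.fromstring(xml_text)
    for resource in root.iter("resource"):
        path = resource.get("path", "")
        if path.startswith("http") and path.lower().endswith(".m3u8"):
            local_path = os.path.join(download_dir, os.path.basename(path))
            fetch(path, local_path)
            resource.set("path", local_path)
    return ET.tostring(root, encoding="unicode")
```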

# Speech recognition

If speech recognition is required, the front end passes the clip-in point and clip-out point of the resource to be recognized through the interface. First call the synthesizer to generate the audio, then perform speech recognition on the generated audio. If the recognized text is too long, it should also be segmented by the tokenizer before being returned to the front end.
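The pipeline above can be sketched as the following three steps. The synthesizer, recognizer, and tokenizer are external services not specified here, so they are taken as injected callables; `max_len` is an illustrative threshold, not a documented value:

```python
from typing import Callable, List

def transcribe_clip(project_xml: str,
                    synthesize: Callable[[str], str],
                    recognize: Callable[[str], str],
                    segment: Callable[[str], List[str]],
                    max_len: int = 20) -> List[str]:
    """Synthesize audio from the project file, recognize it, and
    segment the text if it exceeds max_len."""
    audio_path = synthesize(project_xml)   # step 1: synthesizer renders the audio
    text = recognize(audio_path)           # step 2: speech recognition on that audio
    if len(text) > max_len:                # step 3: long text goes through the tokenizer
        return segment(text)
    return [text]
```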

For video speech recognition, the project file to synthesize is:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<timeline resWidth="960" resHeight="540" duration="%s" videoSize="16:9" controlSpeed="1" trackCount="1"
          noCaptions="false" sizeLevel="0">
    <videoTracks>
        <videoTrack index="0" volume="1" show="true">
            <videos>
                <video id="%s" index="0" type="1" path="%s" alphaM3u8Path="" alphaM3u8Url="" alphaPath="" inPoint="%s"
                       outPoint="%s" trimIn="%s" trimOut="%s" orgDuration="%s" volume="1" speed="1" bgBlur="false"
                       fadeInDuration="0" fadeOutDuration="0" extraRotation="0" reverse="false" noAudio="false"
                       uuid="%s" title="%s" leftChannelUrl="" rightChannelUrl=""/>
            </videos>
        </videoTrack>
    </videoTracks>
</timeline>
```
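The `%s` placeholders suggest positional string formatting. A minimal sketch of filling the video template with Python `%`-formatting, where the values, the microsecond time unit, and the file names are all assumptions for illustration:

```python
import uuid

VIDEO_TEMPLATE = """<?xml version="1.0" encoding="UTF-8"?>
<timeline resWidth="960" resHeight="540" duration="%s" videoSize="16:9" controlSpeed="1" trackCount="1"
          noCaptions="false" sizeLevel="0">
    <videoTracks>
        <videoTrack index="0" volume="1" show="true">
            <videos>
                <video id="%s" index="0" type="1" path="%s" alphaM3u8Path="" alphaM3u8Url="" alphaPath="" inPoint="%s"
                       outPoint="%s" trimIn="%s" trimOut="%s" orgDuration="%s" volume="1" speed="1" bgBlur="false"
                       fadeInDuration="0" fadeOutDuration="0" extraRotation="0" reverse="false" noAudio="false"
                       uuid="%s" title="%s" leftChannelUrl="" rightChannelUrl=""/>
            </videos>
        </videoTrack>
    </videoTracks>
</timeline>"""

# Illustrative values for a 10-second clip; the time unit (microseconds
# here) is an assumption, not confirmed by the template.
duration = 10_000_000
project_xml = VIDEO_TEMPLATE % (
    duration,           # timeline duration
    1,                  # video id
    "/opt/input.mp4",   # local video path
    0,                  # inPoint on the timeline
    duration,           # outPoint on the timeline
    0,                  # trimIn within the source
    duration,           # trimOut within the source
    duration,           # orgDuration of the source
    str(uuid.uuid4()),  # uuid
    "input.mp4",        # title
)
```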

For audio speech recognition, the project file to synthesize is:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<timeline resWidth="960" resHeight="540" duration="%s" videoSize="16:9" controlSpeed="1" trackCount="1"
          noCaptions="false" sizeLevel="0">
    <audioTracks>
        <audioTrack index="0" volume="1" show="true">
            <audios>
                <audio id="%s" index="0" type="1" path="%s" alphaM3u8Path="" alphaM3u8Url="" alphaPath="" inPoint="%s"
                       outPoint="%s" trimIn="%s" trimOut="%s" orgDuration="%s" volume="1" speed="1" bgBlur="false"
                       fadeInDuration="0" fadeOutDuration="0" extraRotation="0" reverse="false" noAudio="false"
                       uuid="%s" title="%s" leftChannelUrl="" rightChannelUrl=""/>
            </audios>
        </audioTrack>
    </audioTracks>
</timeline>
```