Is it possible to merge two video files to one file, one screen in iOS?

I'm new to video programming. I'm trying to practice it, but I'm having trouble merging two video files into one.

What I mean by "merge" is the following:

I have a first video like this

and a second video like this

and I want them merged like this

I don't want to use two video players, because I want to send the merged video file to someone. I searched all day for a solution but couldn't find one.

I wrote my code referring to this link, but it only shows the first video; the two are not merged.

My code:

NSURL *firstURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video1" ofType:@"mp4"]];
AVURLAsset *firstAsset = [[AVURLAsset alloc] initWithURL:firstURL options:nil];

NSURL *secondURL = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"video2" ofType:@"mp4"]];
AVURLAsset *secondAsset = [[AVURLAsset alloc] initWithURL:secondURL options:nil];

AVMutableComposition* mixComposition = [AVMutableComposition composition];
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                  preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                    ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                     atTime:kCMTimeZero error:nil];

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                   preferredTrackID:kCMPersistentTrackID_Invalid];

[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                     ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                      atTime:kCMTimeZero error:nil];

[secondTrack setPreferredTransform:CGAffineTransformMakeScale(0.25f,0.25f)];

NSArray *dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
NSString *docsDir = [dirPaths objectAtIndex:0];
NSString *outputFilePath = [docsDir stringByAppendingPathComponent:[NSString stringWithFormat:@"FinalVideo.mov"]];

NSLog(@"%@", outputFilePath);

NSURL *outputFileUrl = [NSURL fileURLWithPath:outputFilePath];
if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
    [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];


AVAssetExportSession* assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
assetExport.outputFileType = @"com.apple.quicktime-movie";
assetExport.outputURL = outputFileUrl;

[assetExport exportAsynchronouslyWithCompletionHandler: ^(void ) {

     switch (assetExport.status) {
         case AVAssetExportSessionStatusFailed:
             NSLog(@"AVAssetExportSessionStatusFailed");
             break;
         case AVAssetExportSessionStatusCompleted:
             NSLog(@"AVAssetExportSessionStatusCompleted");
             break;
         case AVAssetExportSessionStatusWaiting:
             NSLog(@"AVAssetExportSessionStatusWaiting");
             break;
         default:
             break;
     }
 }
 ];

What am I missing? I don't know how to solve this problem.

Any ideas are appreciated. Thanks.

EDIT:

I wrote new code referring to the link matt posted — thanks, matt. But when I try to export it, only the first video is exported. They are not combined. :(

My new code:

NSURL *originalVideoURL1 = [[NSBundle mainBundle] URLForResource:@"video1" withExtension:@"mov"];
NSURL *originalVideoURL2 = [[NSBundle mainBundle] URLForResource:@"video2" withExtension:@"mov"];


AVURLAsset *firstAsset = [AVURLAsset URLAssetWithURL:originalVideoURL1 options:nil];
AVURLAsset *secondAsset = [AVURLAsset URLAssetWithURL:originalVideoURL2 options:nil];

AVMutableComposition* mixComposition = [[AVMutableComposition alloc] init]; //[AVMutableComposition composition];

NSError *error = nil;
AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration) ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"firstTrack error!!!. %@", error.localizedDescription);
}

AVMutableCompositionTrack *secondTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
[secondTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration) ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:kCMTimeZero error:&error];

if(error) {
    NSLog(@"secondTrack error!!!. %@", error.localizedDescription);
}


AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, firstAsset.duration);

AVMutableVideoCompositionLayerInstruction *firstLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:firstTrack];
CGAffineTransform scale = CGAffineTransformMakeScale(0.7, 0.7);
CGAffineTransform move = CGAffineTransformMakeTranslation(230, 230);
[firstLayerInstruction setTransform:CGAffineTransformConcat(scale, move) atTime:kCMTimeZero];

AVMutableVideoCompositionLayerInstruction *secondLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:secondTrack];
CGAffineTransform secondScale = CGAffineTransformMakeScale(1.2, 1.5);
CGAffineTransform secondMove = CGAffineTransformMakeTranslation(0, 0);
[secondLayerInstruction setTransform:CGAffineTransformConcat(secondScale, secondMove) atTime:kCMTimeZero];

mainInstruction.layerInstructions = @[firstLayerInstruction, secondLayerInstruction];

AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
mainCompositionInst.instructions = @[mainInstruction];
mainCompositionInst.frameDuration = CMTimeMake(1, 30);
mainCompositionInst.renderSize = CGSizeMake(640, 480);

AVPlayerItem *newPlayerItem = [AVPlayerItem playerItemWithAsset:mixComposition];
newPlayerItem.videoComposition = mainCompositionInst;

AVPlayer *player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];

AVPlayerLayer *playerLayer =[AVPlayerLayer playerLayerWithPlayer:player];

[playerLayer setFrame:self.view.bounds];
[self.view.layer addSublayer:playerLayer];
[player seekToTime:kCMTimeZero];
[player play]; // play is Good!!


NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,NSUserDomainMask, YES);
NSString *documentsDirectory = [paths objectAtIndex:0];

NSString *tempS2 = [documentsDirectory stringByAppendingPathComponent:@"FinalVideo.mov"];

if([[NSFileManager defaultManager] fileExistsAtPath:tempS2])
{
    [[NSFileManager defaultManager] removeItemAtPath:tempS2 error:nil];
}


NSURL *url = [[NSURL alloc] initFileURLWithPath: tempS2];

AVAssetExportSession *exportSession = [[AVAssetExportSession alloc]
                                       initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];

exportSession.outputURL=url;

NSLog(@"%@", [exportSession supportedFileTypes]);

exportSession.outputFileType = AVFileTypeQuickTimeMovie;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status==AVAssetExportSessionStatusFailed) {
        NSLog(@"failed");
    }
    else {
        NSLog(@"AudioLocation : %@",tempS2);
    }
}];

How can I export my mixComposition with the layerInstruction applied?

Any advice is appreciated.

Thanks.

Regarding the code in your second edit: just as you've told your AVPlayerItem about your AVMutableVideoComposition, you also need to tell the AVAssetExportSession:

exportSession.videoComposition = mainCompositionInst;
// exportAsynchronouslyWithCompletionHandler etc

N.B. Make sure to pick the longer of the two tracks' durations when setting the instruction's duration:

mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, CMTimeMaximum(firstAsset.duration, secondAsset.duration));

AVPlayer doesn't mind if you get this wrong, but AVAssetExportSession does, and will return an AVErrorInvalidVideoComposition (-11841) error.
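To make that failure visible, you can log the underlying NSError in the export completion handler — a minimal sketch, assuming an `exportSession` set up as in your code with `videoComposition` assigned:

```objectivec
exportSession.videoComposition = mainCompositionInst;
[exportSession exportAsynchronouslyWithCompletionHandler:^{
    if (exportSession.status == AVAssetExportSessionStatusFailed) {
        // error.code will be AVErrorInvalidVideoComposition (-11841)
        // if the instruction's timeRange doesn't cover the composition.
        NSLog(@"Export failed: %@ (code %ld)",
              exportSession.error.localizedDescription,
              (long)exportSession.error.code);
    } else if (exportSession.status == AVAssetExportSessionStatusCompleted) {
        NSLog(@"Export completed: %@", exportSession.outputURL);
    }
}];
```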

N.B. 2: Your AVPlayer doesn't actually go out of scope, but it makes me nervous when I look at it. I'd assign it to a property if I were you.
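For example, a strong property on the view controller keeps the player alive for the lifetime of the screen (the class name here is illustrative):

```objectivec
// In the view controller's class extension:
@interface MyViewController ()
// Strong reference so ARC doesn't release the player while it's playing.
@property (nonatomic, strong) AVPlayer *player;
@end

// Then, instead of a local variable:
self.player = [[AVPlayer alloc] initWithPlayerItem:newPlayerItem];
```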