iOS Video Editing: Cropping, Concatenation, and Processing
In simple, code-driven video editing, the main tasks are beautification filters, watermarks (overlays), video cropping, video concatenation, and audio/video processing. For beautification, GPUImage already provides many filters plus skin smoothing, and it can even do real-time beautification with face detection; mature solutions exist for all of that. So this article concentrates on the rest: watermarks (overlays), video cropping, video concatenation, and audio/video processing. A complete demo is linked at the end of the article; it edits a video and then saves it to the system photo album. The article itself focuses on the points that need attention.
As mentioned in the previous article, GPUImage only applies filters to the video; it does not touch the video track or the audio track. So for cropping and other editing work, the processing is still done with AVFoundation, operating directly on the video and audio tracks.
1. Complete video cropping source code
// Re-record the video once through GPUImage
- (void)saveVedioPath:(NSURL *)vedioPath WithFileName:(NSString *)fileName andCallBack:(JLXCommonToolVedioCompletionHandler)completion {
    self.completionHandler = completion;

    // Filter; mix is the opacity of the blended layer, hard-coded to 1.0 here
    filter = [[GPUImageAlphaBlendFilter alloc] init];
    [(GPUImageAlphaBlendFilter *)filter setMix:1.0f];

    // Playback source
    NSURL *sampleURL = vedioPath;
    AVAsset *asset = [AVAsset assetWithURL:sampleURL];
    movieFile = [[GPUImageMovie alloc] initWithAsset:asset];
    movieFile.runBenchmark = YES;
    movieFile.playAtActualSpeed = NO;

    AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    // Whether the video was shot in portrait
    BOOL isVideoAssetvertical = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        isVideoAssetvertical = YES;
        videoAssetOrientation_ = UIImageOrientationUp;   // portrait, upright
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        isVideoAssetvertical = YES;
        videoAssetOrientation_ = UIImageOrientationDown; // portrait, upside down
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        isVideoAssetvertical = NO;
        videoAssetOrientation_ = UIImageOrientationLeft; // landscape, left
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        isVideoAssetvertical = NO;
        videoAssetOrientation_ = UIImageOrientationRight; // landscape, right
    }

    GPUImageView *filterView = [[GPUImageView alloc] initWithFrame:CGRectMake(0, 0, asset.naturalSize.width, asset.naturalSize.height)];
    [filterView setTransform:CGAffineTransformMakeRotation(M_PI_2)];

    UIView *view = [[UIView alloc] initWithFrame:CGRectMake(0, 0, asset.naturalSize.width, asset.naturalSize.height)];
    [view setBackgroundColor:[UIColor clearColor]];
    GPUImageUIElement *uielement = [[GPUImageUIElement alloc] initWithView:view];

    NSString *pathToMovie = [NSHomeDirectory() stringByAppendingPathComponent:[NSString stringWithFormat:@"Documents/%@.mp4", fileName]];
    unlink([pathToMovie UTF8String]);
    NSURL *movieURL = [NSURL fileURLWithPath:pathToMovie];
    movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(asset.naturalSize.width, asset.naturalSize.height)];

    GPUImageFilter *progressFilter = [[GPUImageFilter alloc] init];
    [movieFile addTarget:progressFilter];
    [progressFilter addTarget:filter];
    [uielement addTarget:filter];
    movieWriter.shouldPassthroughAudio = YES;
    if ([[asset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        movieFile.audioEncodingTarget = movieWriter;
    } else { // no audio
        movieFile.audioEncodingTarget = nil;
    }
    [movieFile enableSynchronizedEncodingUsingMovieWriter:movieWriter];
    // Render to screen and to the writer
    [filter addTarget:filterView];
    [filter addTarget:movieWriter];

    [movieWriter startRecording];
    [movieFile startProcessing];

    __weak typeof(self) weakSelf = self;
    [progressFilter setFrameProcessingCompletionBlock:^(GPUImageOutput *output, CMTime time) {
        [uielement update];
    }];
    [movieWriter setCompletionBlock:^{
        __strong typeof(self) strongSelf = weakSelf;
        [strongSelf->filter removeTarget:strongSelf->movieWriter];
        [strongSelf->movieWriter finishRecording];
        if (strongSelf.completionHandler) {
            strongSelf.completionHandler(movieURL, nil, isVideoAssetvertical);
        }
    }];
}
/**
 Crop a video

 @param videoPath       path of the source video
 @param startTime       time the crop starts at, in seconds
 @param endTime         time the crop ends at; 0 means "to the end of the video"
 @param videoSize       output frame size; pass zero to leave the frame uncropped
 @param videoDealPoint  Point(x,y) the crop starts from; pass zero to crop from (0,0)
 @param fileName        name of the output file
 @param shouldScale     whether to scale; if NO, nothing is stretched and the black background is cropped
 */
- (void)saveVideoPath:(NSURL *)videoPath withStartTime:(float)startTime withEndTime:(float)endTime withSize:(CGSize)videoSize withVideoDealPoint:(CGPoint)videoDealPoint WithFileName:(NSString *)fileName shouldScale:(BOOL)shouldScale {
    if (!videoPath) {
        [SVProgressHUD dismiss];
        return;
    }
    [SVProgressHUD showWithStatus:@"Cropping video to the photo album"];

    // 1 Create the AVAsset instance; an AVAsset carries all of the video's information
    NSDictionary *opts = [NSDictionary dictionaryWithObject:@(YES) forKey:AVURLAssetPreferPreciseDurationAndTimingKey];
    videoAsset = [AVURLAsset URLAssetWithURL:videoPath options:opts];

    // Detect WeChat videos via their metadata (see section 1.1)
    bool isWXVideo = false;
    for (int i = 0; i < videoAsset.metadata.count; i++) {
        AVMetadataItem *item = [videoAsset.metadata objectAtIndex:i];
        NSLog(@"======metadata:%@,%@,%@,%@", item.identifier, item.extraAttributes, item.value, item.dataType);
        NSDictionary *dic = [self StrToArrayOrNSDictionary:[NSString stringWithFormat:@"%@", item.value]];
        if ([[dic.allKeys objectAtIndex:0] isEqualToString:@"WXVer"]) {
            isWXVideo = true;
            // WeChat video: re-encode it through GPUImage first, then crop the re-encoded file
            [self saveVedioPath:videoPath WithFileName:@"wxVideo" andCallBack:^(NSURL *assetURL, NSError *error, BOOL isVideoAssetvertical) {
                dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
                    NSString *newVideoPath = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents/wxVideo.mp4"];
                    [self goSaveVideoPath:[NSURL fileURLWithPath:newVideoPath] withStartTime:startTime withEndTime:endTime withSize:videoSize withVideoDealPoint:videoDealPoint WithFileName:fileName shouldScale:shouldScale isWxVideoAssetvertical:isVideoAssetvertical];
                });
            }];
            break;
        }
    }
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] == 0) {
        return;
    }
    if (!isWXVideo) {
        [self goSaveVideoPath:videoPath withStartTime:startTime withEndTime:endTime withSize:videoSize withVideoDealPoint:videoDealPoint WithFileName:fileName shouldScale:shouldScale isWxVideoAssetvertical:NO];
    }
}

// Assetvertical: GPUImage re-renders WeChat portrait video as landscape; landscape stays landscape
- (void)goSaveVideoPath:(NSURL *)videoPath withStartTime:(float)startTime withEndTime:(float)endTime withSize:(CGSize)videoSize withVideoDealPoint:(CGPoint)videoDealPoint WithFileName:(NSString *)fileName shouldScale:(BOOL)shouldScale isWxVideoAssetvertical:(BOOL)Assetvertical {
    if (!videoPath) {
        [SVProgressHUD dismiss];
        return;
    }
    // 1 Create the AVAsset instance
    NSDictionary *opts = [NSDictionary dictionaryWithObjectsAndKeys:@(YES), AVURLAssetPreferPreciseDurationAndTimingKey, @(AVAssetReferenceRestrictionForbidNone), AVURLAssetReferenceRestrictionsKey, nil];
    videoAsset = [AVURLAsset URLAssetWithURL:videoPath options:opts];

    // Start of the crop
    CMTime startCropTime = CMTimeMakeWithSeconds(startTime, 600);
    // End of the crop; endTime == 0 means "to the end of the video"
    CMTime endCropTime = (endTime == 0) ? videoAsset.duration : CMTimeMakeWithSeconds(endTime, videoAsset.duration.timescale);

    // 2 Create the AVMutableComposition instance. From the Apple Developer docs:
    // "AVMutableComposition is a mutable subclass of AVComposition you use when you want to create
    //  a new composition from existing assets. You can add and remove tracks, and you can add,
    //  remove, and scale time ranges."
    AVMutableComposition *mixComposition = [AVMutableComposition composition];

    // Has audio
    if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
        // Audio source
        AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:videoPath options:opts];
        // Audio composition track
        AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        // Audio source track
        AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
        // CMTimeRange is (start, duration), so build it from the start/end pair explicitly
        [audioTrack insertTimeRange:CMTimeRangeFromTimeToTime(startCropTime, endCropTime) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
    }

    // 3 Video composition track. A composition holds tracks (video, audio, ...) into which source material is inserted.
    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    NSError *error;
    // Insert the source video track into the mutable track; the time-based cropping happens in this TimeRange
    [videoTrack insertTimeRange:CMTimeRangeFromTimeToTime(startCropTime, endCropTime) ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject] atTime:kCMTimeZero error:&error];

    // 3.1 AVMutableVideoCompositionInstruction: one clip on the video timeline; it can be scaled, rotated, etc.
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, videoTrack.timeRange.duration);

    // 3.2 AVMutableVideoCompositionLayerInstruction: one video track, covering all the clips on that track
    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];

    AVAssetTrack *videoAssetTrack = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject];
    UIImageOrientation videoAssetOrientation_ = UIImageOrientationUp;
    // Whether the video was shot in portrait
    BOOL isVideoAssetvertical = NO;
    CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
    if (videoTransform.a == 0 && videoTransform.b == 1.0 && videoTransform.c == -1.0 && videoTransform.d == 0) {
        isVideoAssetvertical = YES;
        videoAssetOrientation_ = UIImageOrientationUp;   // portrait, upright
    }
    if (videoTransform.a == 0 && videoTransform.b == -1.0 && videoTransform.c == 1.0 && videoTransform.d == 0) {
        isVideoAssetvertical = YES;
        videoAssetOrientation_ = UIImageOrientationDown; // portrait, upside down
    }
    if (videoTransform.a == 1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == 1.0) {
        isVideoAssetvertical = NO;
        videoAssetOrientation_ = UIImageOrientationLeft; // landscape, left
    }
    if (videoTransform.a == -1.0 && videoTransform.b == 0 && videoTransform.c == 0 && videoTransform.d == -1.0) {
        isVideoAssetvertical = NO;
        videoAssetOrientation_ = UIImageOrientationRight; // landscape, right
    }

    float scaleX = 1.0, scaleY = 1.0, scale = 1.0;
    CGSize originVideoSize;
    if (isVideoAssetvertical || Assetvertical) {
        // Portrait: naturalSize reports landscape dimensions, so swap width and height (see section 1.5)
        originVideoSize = CGSizeMake([[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height, [[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width);
    } else {
        originVideoSize = CGSizeMake([[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width, [[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height);
    }

    float x = videoDealPoint.x;
    float y = videoDealPoint.y;
    if (shouldScale) {
        scaleX = videoSize.width / originVideoSize.width;
        scaleY = videoSize.height / originVideoSize.height;
        scale = MAX(scaleX, scaleY);
        if (scaleX > scaleY) {
            NSLog(@"portrait");
        } else {
            NSLog(@"landscape");
        }
    } else {
        scaleX = 1.0;
        scaleY = 1.0;
        scale = 1.0;
    }

    if (Assetvertical) {
        // Re-encoded WeChat portrait video: rotate it back; the extra 720 shifts the rotated frame back into the render area
        CGAffineTransform trans = CGAffineTransformMake(videoAssetTrack.preferredTransform.a * scale, videoAssetTrack.preferredTransform.b * scale, videoAssetTrack.preferredTransform.c * scale, videoAssetTrack.preferredTransform.d * scale, videoAssetTrack.preferredTransform.tx * scale - x + 720, videoAssetTrack.preferredTransform.ty * scale - y);
        CGAffineTransform trans2 = CGAffineTransformRotate(trans, M_PI_2);
        [videolayerInstruction setTransform:trans2 atTime:kCMTimeZero];
    } else {
        CGAffineTransform trans = CGAffineTransformMake(videoAssetTrack.preferredTransform.a * scale, videoAssetTrack.preferredTransform.b * scale, videoAssetTrack.preferredTransform.c * scale, videoAssetTrack.preferredTransform.d * scale, videoAssetTrack.preferredTransform.tx * scale - x, videoAssetTrack.preferredTransform.ty * scale - y);
        [videolayerInstruction setTransform:trans atTime:kCMTimeZero];
    }
    // An alternative way to crop:
    // [videolayerInstruction setCropRectangle:CGRectMake(0, 0, 720, 720) atTime:kCMTimeZero];

    // 3.3 Add the instructions
    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction, nil];

    // AVMutableVideoComposition manages all video tracks and decides the final video's frame size; spatial cropping is done here
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    CGSize naturalSize = originVideoSize; // the orientation-corrected size computed above
    int64_t renderWidth = 0, renderHeight = 0;
    if (videoSize.height == 0.0 || videoSize.width == 0.0) {
        renderWidth = naturalSize.width;
        renderHeight = naturalSize.height;
    } else {
        renderWidth = ceil(videoSize.width);
        renderHeight = ceil(videoSize.height);
    }
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    // 4 Output path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"%@.mp4", fileName]];
    unlink([myPathDocs UTF8String]);
    NSURL *videoUrl = [NSURL fileURLWithPath:myPathDocs];

    // 5 Export the composed video
    exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = videoUrl;
    exporter.outputFileType = AVFileTypeMPEG4;
    exporter.shouldOptimizeForNetworkUse = YES;
    exporter.videoComposition = mainCompositionInst;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            // Post-export work goes here
            [self cropExportDidFinish:exporter];
        });
    }];
}

- (void)cropExportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            [SVProgressHUD dismiss];
            __block PHObjectPlaceholder *placeholder;
            if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL.path)) {
                NSError *error;
                [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
                    PHAssetChangeRequest *createAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
                    placeholder = [createAssetRequest placeholderForCreatedAsset];
                } error:&error];
                if (error) {
                    [SVProgressHUD showErrorWithStatus:[NSString stringWithFormat:@"%@", error]];
                } else {
                    [SVProgressHUD showSuccessWithStatus:@"Video saved to the photo album"];
                }
            } else {
                [SVProgressHUD showErrorWithStatus:NSLocalizedString(@"Saving to the photo album failed; please grant the app photo library access", nil)];
            }
        });
    } else {
        NSLog(@"%@", session.error);
        [SVProgressHUD showErrorWithStatus:NSLocalizedString(@"Cropping failed", nil)];
    }
}

// Convert JSON data into a dictionary or an array
- (id)DataToArrayOrNSDictionary:(NSData *)jsonData {
    NSError *error = nil;
    id jsonObject = [NSJSONSerialization JSONObjectWithData:jsonData options:NSJSONReadingAllowFragments error:&error];
    if (jsonObject != nil && error == nil) {
        return jsonObject;
    } else {
        // Parse error
        return nil;
    }
}

// Convert a JSON string into a dictionary or an array
- (id)StrToArrayOrNSDictionary:(NSString *)jsonStr {
    NSData *jsonData = [jsonStr dataUsingEncoding:NSUTF8StringEncoding];
    return [self DataToArrayOrNSDictionary:jsonData];
}
To invoke it, just call:
- (void)cropImage {
    NSURL *videoPath = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"selfS" ofType:@"MOV"]];
    [self saveVideoPath:videoPath withStartTime:0.1 withEndTime:0 withSize:CGSizeMake(300, 300) withVideoDealPoint:CGPointMake(50, 50) WithFileName:@"cropVideo" shouldScale:YES];
}
In fact, for ordinary video cropping the following function alone is enough:
- (void)goSaveVideoPath:(NSURL*)videoPath withStartTime:(float)startTime withEndTime:(float)endTime withSize:(CGSize)videoSize withVideoDealPoint:(CGPoint)videoDealPoint WithFileName:(NSString*)fileName shouldScale:(BOOL)shouldScale isWxVideoAssetvertical:(BOOL)Assetvertical
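For a clip that does not come from WeChat, it can be called directly; for example (paths and crop parameters mirror the demo call above):

NSURL *videoPath = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"selfS" ofType:@"MOV"]];
// Crop a 300x300 region starting at (50, 50), from second 0.1 to the end of the clip
[self goSaveVideoPath:videoPath
        withStartTime:0.1
          withEndTime:0
             withSize:CGSizeMake(300, 300)
   withVideoDealPoint:CGPointMake(50, 50)
         WithFileName:@"cropVideo"
          shouldScale:YES
isWxVideoAssetvertical:NO];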
1.1 Handling WeChat videos
In practice, however, it turned out that the AVFoundation code in this function alone handles WeChat videos poorly: a ten-second clip shot with WeChat's built-in camera can be cropped "successfully", but the result is a blue screen, with sound and no picture. Comparing the printed metadata reveals the difference:
Shot with WeChat:
2017-05-11 19:35:58.529751+0800 JianLiXiu[9592:2774766] ======metadata:uiso/dscp,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },{"WXVer":369428256},com.apple.metadata.datatype.UTF-8
2017-05-11 19:35:58.531034+0800 JianLiXiu[9592:2774766] ======commonMetadata:uiso/dscp,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },{"WXVer":369428256},com.apple.metadata.datatype.UTF-8

Downloaded from WeChat:
======metadata:uiso/dscp,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },{"WXVer":369428256},com.apple.metadata.datatype.UTF-8
======commonMetadata:uiso/dscp,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },{"WXVer":369428256},com.apple.metadata.datatype.UTF-8

Shot with the system camera:
2017-05-11 19:35:09.708010+0800 JianLiXiu[9592:2774253] ======metadata:uiso/loci,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },+31.1711+121.3836+045.636/,com.apple.metadata.datatype.UTF-8
2017-05-11 19:35:09.710245+0800 JianLiXiu[9592:2774253] ======metadata:uiso/date,{ dataType = 0; dataTypeNamespace = "com.apple.quicktime.udta"; },2017-05-11T17:40:41+0800,com.apple.metadata.datatype.raw-data
2017-05-11 19:35:09.712193+0800 JianLiXiu[9592:2774253] ======commonMetadata:uiso/date,{ dataType = 0; dataTypeNamespace = "com.apple.quicktime.udta"; },2017-05-11T17:40:41+0800,com.apple.metadata.datatype.raw-data
2017-05-11 19:35:09.712746+0800 JianLiXiu[9592:2774253] ======commonMetadata:uiso/loci,{ dataType = 2; dataTypeNamespace = "com.apple.quicktime.udta"; },+31.1711+121.3836+045.636/,com.apple.metadata.datatype.UTF-8
So the metadata is used to decide whether a video came from WeChat: if it contains {"WXVer":369428256}, the video was processed by WeChat, and in that case it is first re-rendered and re-encoded once through GPUImage before being edited.
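Pulled out on its own, the detection logic might look like the sketch below (isWeChatVideo: is a hypothetical helper name; StrToArrayOrNSDictionary: is the JSON helper from the listing above):

// YES if the asset's metadata carries WeChat's WXVer marker
- (BOOL)isWeChatVideo:(AVAsset *)asset {
    for (AVMetadataItem *item in asset.metadata) {
        // WeChat writes a {"WXVer": ...} JSON fragment into the udta metadata
        NSDictionary *dic = [self StrToArrayOrNSDictionary:[NSString stringWithFormat:@"%@", item.value]];
        if (dic.allKeys.count > 0 && [[dic.allKeys objectAtIndex:0] isEqualToString:@"WXVer"]) {
            return YES;
        }
    }
    return NO;
}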
In the code, the line
CGAffineTransform videoTransform = videoAssetTrack.preferredTransform;
represents the video's capture orientation. For normally shot videos, printing (a, b, c, d) gives:
System camera, portrait:   0, 1, -1, 0
System camera, landscape:  1, 0, 0, 1
QQ, portrait:              1, 0, 0, 1
QQ, landscape:             0, -1, 1, 0
WeChat, portrait:          0, 1, -1, 0
WeChat, landscape:         1, 0, 0, 1
But after a WeChat video has been re-recorded through GPUImage, the capture orientation becomes:
WeChat, portrait:  1, 0, 0, 1
WeChat, landscape: 1, 0, 0, 1
In other words, portrait or landscape, the transform comes out the same. As a result, cropping works for landscape footage but produces a wrong video for portrait footage, so the original video's orientation has to be determined first and then accounted for during cropping.
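The four if-blocks in the listing can be condensed into one small helper; a minimal sketch (the helper name is hypothetical):

// YES if the clip was shot in portrait, judging by its preferredTransform
- (BOOL)isPortraitTrack:(AVAssetTrack *)track {
    CGAffineTransform t = track.preferredTransform;
    // Portrait upright: (a,b,c,d) = (0,1,-1,0); portrait upside down: (0,-1,1,0)
    // Landscape left/right: (1,0,0,1) / (-1,0,0,-1)
    return (t.a == 0 && t.d == 0 && (t.b == 1.0 || t.b == -1.0));
}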
1.2 Choosing the crop position
int64_t renderWidth = 0, renderHeight = 0;
determines the size of the rendered output, so to get a specific output size, set it here. By default, however, the crop starts at (0, 0); to crop starting from somewhere in the middle of the frame you also have to move the layer via videolayerInstruction.
videolayerInstruction carries the movement effects of the video track: translation, rotation, scaling and so on are all applied through it, so the crop position is reached by setting a CGAffineTransform on it.
For a transform (a, b, c, d, tx, ty), a point (x, y) maps to (a·x + c·y + tx, b·x + d·y + ty). So x stretches with the value of c (the view's width changes with it) and y stretches with the value of b (the view's height changes with it); note that changing b and c does not move the view's point (its center), which makes these two parameters quite interesting.
x is translated by t.x and y is translated by t.y; here the point (the center) does move with the transform.
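As a concrete example: with scale = 1.0 and an identity preferredTransform (a = 1, b = 0, c = 0, d = 1, i.e. landscape footage), moving the crop origin to (50, 50) reduces to a pure translation on the layer instruction from the listing:

// x' = a*x + c*y + tx, y' = b*x + d*y + ty
// Shifting the content left and up by 50 points puts the desired region at the
// origin of the render area, so everything left of / above it falls outside the output
CGAffineTransform trans = CGAffineTransformMake(1.0, 0, 0, 1.0, -50.0, -50.0);
[videolayerInstruction setTransform:trans atTime:kCMTimeZero];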
Apple provides the following ready-made transform helpers:
Translation:
① Create a translation transform from scratch: CGAffineTransformMakeTranslation(CGFloat tx, CGFloat ty)
② Translate an existing transform: CGAffineTransformTranslate(CGAffineTransform t, CGFloat tx, CGFloat ty)
Scaling:
① Create a scaling transform from scratch: CGAffineTransformMakeScale(CGFloat sx, CGFloat sy)
② Scale an existing transform: CGAffineTransformScale(CGAffineTransform t, CGFloat sx, CGFloat sy)
Rotation:
① Create a rotation transform from scratch: CGAffineTransformMakeRotation(CGFloat angle) (angle is the rotation angle, in radians)
② Rotate an existing transform: CGAffineTransformRotate(CGAffineTransform t, CGFloat angle)
So by combining the render size with the movement applied to the layer, you control both which point the crop starts from and how large the cropped video is.
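For instance, a 300×300 crop starting at source point (50, 50), built from Apple's helpers instead of a hand-written matrix (a sketch; videolayerInstruction, mainCompositionInst and originVideoSize are the variables from the listing above):

// Scale so the source fills the 300x300 render area, as in the listing
float scale = MAX(300.0 / originVideoSize.width, 300.0 / originVideoSize.height);
CGAffineTransform t = CGAffineTransformMakeScale(scale, scale);
// CGAffineTransformTranslate applies the translation in pre-scale source
// coordinates, so source point (50, 50) ends up at the output origin
t = CGAffineTransformTranslate(t, -50.0, -50.0);
[videolayerInstruction setTransform:t atTime:kCMTimeZero];
mainCompositionInst.renderSize = CGSizeMake(300, 300); // output exactly the cropped region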
1.3 Choosing the crop time range
The crop time range is controlled while editing the video and audio tracks, by setting the time range that gets inserted. Note that a CMTimeRange consists of a start time and a duration, so a start/end pair should be converted with CMTimeRangeFromTimeToTime:
// Insert the source video track into the mutable track; the time-based cropping happens in this TimeRange
[videoTrack insertTimeRange:CMTimeRangeFromTimeToTime(startCropTime, endCropTime)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject]
                     atTime:kCMTimeZero
                      error:&error];
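For example, to keep only seconds 2 through 7 of the source clip:

CMTime start = CMTimeMakeWithSeconds(2.0, 600);
CMTime end   = CMTimeMakeWithSeconds(7.0, 600);
// CMTimeRangeFromTimeToTime converts the start/end pair into (start, duration)
[videoTrack insertTimeRange:CMTimeRangeFromTimeToTime(start, end)
                    ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] lastObject]
                     atTime:kCMTimeZero
                      error:&error];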
1.4 Handling videos without audio, such as time-lapse videos
As the previous article explained, videos without audio cause errors during processing, so when cropping you also have to check whether the asset has an audio track, and skip adding the audio track if it does not:
// Has audio
if ([[videoAsset tracksWithMediaType:AVMediaTypeAudio] count] > 0) {
    // Audio source
    AVURLAsset *audioAsset = [[AVURLAsset alloc] initWithURL:videoPath options:opts];
    // Audio composition track
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    // Audio source track
    AVAssetTrack *audioAssetTrack = [[audioAsset tracksWithMediaType:AVMediaTypeAudio] lastObject];
    [audioTrack insertTimeRange:CMTimeRangeFromTimeToTime(startCropTime, endCropTime) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
}
1.5 Handling the original video size
When only the time is cropped and the frame size is left at its default, it turns out that naturalSize is wrong for portrait footage: a 720×1280 video reports a naturalSize of 1280×720, exactly swapped. So for portrait videos the width and height need to be swapped back:
if (isVideoAssetvertical || Assetvertical) {
    // Portrait: naturalSize reports landscape dimensions, so swap width and height
    originVideoSize = CGSizeMake([[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height,
                                 [[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width);
} else {
    originVideoSize = CGSizeMake([[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].width,
                                 [[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] naturalSize].height);
}
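An equivalent approach that avoids the orientation flags entirely is to push naturalSize through the track's preferredTransform and take absolute values; a sketch:

AVAssetTrack *track = [[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
// For portrait footage the transform swaps width and height (components may come out negative)
CGSize transformed = CGSizeApplyAffineTransform(track.naturalSize, track.preferredTransform);
CGSize originVideoSize = CGSizeMake(fabs(transformed.width), fabs(transformed.height));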
2. Video concatenation and audio/video processing
Concatenating two videos and mixing audio with video work the same way: both come down to the start times on the video and audio tracks. If the second clip starts where the first clip ends, the two clips are concatenated; if they start at the same time, the two are mixed together.
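Conceptually, the only difference is the atTime: parameter. A sketch, assuming composition tracks trackA and trackB have already been created as in the example below:

// Concatenation: the second clip starts where the first one ends
[trackA insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                 atTime:firstAsset.duration error:nil];

// Mixing: both clips start at kCMTimeZero, each on its own track
[trackB insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                 atTime:kCMTimeZero error:nil];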
Below is an example that concatenates two videos and then adds a background music track.
- (void)addFirstVideo:(NSURL *)firstVideoPath andSecondVideo:(NSURL *)secondVideo withMusic:(NSURL *)musicPath {
    [SVProgressHUD showWithStatus:@"Compositing to the photo album"];
    AVAsset *firstAsset = [AVAsset assetWithURL:firstVideoPath];
    AVAsset *secondAsset = [AVAsset assetWithURL:secondVideo];
    AVAsset *musicAsset = [AVAsset assetWithURL:musicPath];

    // 1 - Create the AVMutableComposition object. This object will hold the AVMutableCompositionTrack instances.
    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    // 2 - Video track: insert the second clip right where the first one ends
    AVMutableCompositionTrack *firstTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, firstAsset.duration)
                        ofTrack:[[firstAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    [firstTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, secondAsset.duration)
                        ofTrack:[[secondAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:firstAsset.duration error:nil];

    // 3 - Audio track: the background music spans both clips
    if (musicAsset != nil) {
        AVMutableCompositionTrack *AudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
        [AudioTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration))
                            ofTrack:[[musicAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0]
                             atTime:kCMTimeZero error:nil];
    }

    // 4 - Output path
    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs = [documentsDirectory stringByAppendingPathComponent:[NSString stringWithFormat:@"mergeVideo-%d.mov", arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    // 5 - Create the exporter
    exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    exporter.outputURL = url;
    exporter.outputFileType = AVFileTypeQuickTimeMovie;
    exporter.shouldOptimizeForNetworkUse = YES;
    [exporter exportAsynchronouslyWithCompletionHandler:^{
        dispatch_async(dispatch_get_main_queue(), ^{
            [self exportDidFinish:exporter];
        });
    }];
}

- (void)exportDidFinish:(AVAssetExportSession *)session {
    if (session.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = session.outputURL;
        dispatch_after(dispatch_time(DISPATCH_TIME_NOW, (int64_t)(0.2 * NSEC_PER_SEC)), dispatch_get_main_queue(), ^{
            [SVProgressHUD dismiss];
            __block PHObjectPlaceholder *placeholder;
            if (UIVideoAtPathIsCompatibleWithSavedPhotosAlbum(outputURL.path)) {
                NSError *error;
                [[PHPhotoLibrary sharedPhotoLibrary] performChangesAndWait:^{
                    PHAssetChangeRequest *createAssetRequest = [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:outputURL];
                    placeholder = [createAssetRequest placeholderForCreatedAsset];
                } error:&error];
                if (error) {
                    [SVProgressHUD showErrorWithStatus:[NSString stringWithFormat:@"%@", error]];
                } else {
                    [SVProgressHUD showSuccessWithStatus:@"Video saved to the photo album"];
                }
            } else {
                [SVProgressHUD showErrorWithStatus:NSLocalizedString(@"Saving to the photo album failed; please grant the app photo library access", nil)];
            }
        });
    }
}
To invoke it:
- (void)addMusic {
    NSURL *videoPath1 = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"selfS" ofType:@"MOV"]];
    NSURL *videoPath2 = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"selfH" ofType:@"MOV"]];
    NSURL *music = [NSURL fileURLWithPath:[[NSBundle mainBundle] pathForResource:@"music" ofType:@"mp3"]];
    [self addFirstVideo:videoPath1 andSecondVideo:videoPath2 withMusic:music];
}
This concatenates videoPath1 and videoPath2, lays the background music music over them, and saves the resulting video to the photo album.
2.1 Adjusting the audio volume (added 2017-08-14)
Recently I needed to turn down the background music. Of course the music file itself could be made quieter, but processing every file that way is inconvenient, so instead the volume of just the specified audio track is lowered in code, leaving the other audio untouched; this is what AVMutableAudioMix is for.
Adding the following code to the project above is enough:
// Adjust the background music volume -- start
AVMutableAudioMix *videoAudioMixTools = [AVMutableAudioMix audioMix];
if (musicAsset) {
    // Attach volume parameters to the music track
    AVMutableAudioMixInputParameters *firstAudioParam = [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:AudioTrack];
    // Set the track volume; a ramp between two values is possible. 1.0 is full volume,
    // so use a smaller value to turn the background music down.
    [firstAudioParam setVolumeRampFromStartVolume:1.0 toEndVolume:1.0 timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration))];
    [firstAudioParam setTrackID:AudioTrack.trackID];
    videoAudioMixTools.inputParameters = [NSArray arrayWithObject:firstAudioParam];
}
// -- end
exporter.audioMix = videoAudioMixTools;
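To actually lower the music, or fade it out, only the ramp values change; for example, a hypothetical fade from half volume down to silence across both clips:

// Start the background music at 50% volume and fade it to silence by the end
[firstAudioParam setVolumeRampFromStartVolume:0.5
                                  toEndVolume:0.0
                                    timeRange:CMTimeRangeMake(kCMTimeZero, CMTimeAdd(firstAsset.duration, secondAsset.duration))];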