
Original title (1010cc时时彩标准版): Short video recording, adding audio to video

Published: 2019-09-12

Core code

Video processing mainly uses a handful of classes, introduced below.

Put a personal watermark on your video:

(Screenshot: before editing)

(Screenshot: after editing)

Without further ado, here is the code:

CommonEditVideo.h

//
//  CommonEditVideo.h
//  MediumEdit
//
//  Created by Input on 2016/9/23.
//  Copyright © 2016 Input. All rights reserved.
//
/**
 * Before use, add the key NSPhotoLibraryUsageDescription to Info.plist
 * (value: the reason for opening the photo library).
 * Watermark format: custom string + time + device model
 */

#import <UIKit/UIColor.h>
#import <AVFoundation/AVFoundation.h>

@protocol  CommonEditVideoDelegate <NSObject>

@required

//Called when the export to file finishes; no status handling yet
- (void)didExport:(nullable NSURL *) url error:(nullable NSError *) error;

//Return whether to export to file; return NO if you only want a temporary watermark rather than writing it into the file
- (BOOL)willExport:(nullable AVAssetExportSession *) exporter;

//Editing is about to begin
- (void)willEdit;

@end

/**
 * Watermark position
 */

typedef enum : NSUInteger {
    AlignmentUp,        //top
    AlignmentCenter,    //middle
    AlignmentDown,      //bottom
} Alignment;


@interface CommonEditVideo: NSObject

@property (nonatomic, nullable, weak)   id<CommonEditVideoDelegate>     delegate;           //video editing delegate

@property (nonatomic, nullable, strong) NSString                        *watermarkTitle;    //watermark text
@property (nonatomic, nullable, strong) NSURL                           *url;               //output location (defaults to the photo album)

@property Alignment     watermarkPlace;     //watermark position
@property BOOL          isShowTime;         //show the time (default YES)
@property BOOL          isShowModel;        //show the device model (default YES)


- (nullable instancetype)init;
- (void)startEditVideo:(nonnull NSURL *) assetURL;
- (void)videoOutput;

@end

CommonEditVideo.m

//
//  CommonEditVideo.m
//  MediumEdit
//
//  Created by Input on 2016/9/23.
//  Copyright © 2016 Input. All rights reserved.
//

#import "CommonEditVideo.h"
#import <MobileCoreServices/UTCoreTypes.h>
#import <AssetsLibrary/ALAssetsLibrary.h>   //NOTE: ALAssetsLibrary has since been deprecated in favor of the Photos framework

@interface CommonEditVideo ()

@property (nonatomic, nullable, strong) AVAssetExportSession            *exporter;          //export session
@property (nonatomic, nullable, strong) AVAsset                         *videoAsset;        //source video asset

@end

@implementation CommonEditVideo

- (nullable instancetype)init{
    self = [super init];
    if (self){
        self.isShowTime     = YES;
        self.isShowModel    = YES;
        self.watermarkPlace = AlignmentCenter;
    }
    return self;
}

- (void)startEditVideo:(nonnull NSURL *) assetURL{
    self.videoAsset = [[AVURLAsset alloc] initWithURL: assetURL options: nil];

    if (self.delegate){
        [self.delegate willEdit];
    }

    //Initialize the exporter and build the new composition
    [self exporterInitialize];
}

//Callback with the export result
- (void)exportDidFinish{

    if (self.exporter.status == AVAssetExportSessionStatusCompleted) {
        NSURL *outputURL = self.exporter.outputURL;
        ALAssetsLibrary *library = [[ALAssetsLibrary alloc] init];
        if ([library videoAtPathIsCompatibleWithSavedPhotosAlbum:outputURL]) {

            [library writeVideoAtPathToSavedPhotosAlbum:outputURL completionBlock:^(NSURL *assetURL, NSError *error){

                dispatch_async(dispatch_get_main_queue(), ^{

                    if (self.delegate){
                        [self.delegate didExport: self.exporter.outputURL error: error];
                    }

                });
            }];
        }
    }

}

- (void)editWatermarkTitle{
    //Watermark setup
    NSString *modelStr = nil;

    for (int i = 0;  i < self.videoAsset.commonMetadata.count; i++){
        AVMetadataItem *data = self.videoAsset.commonMetadata[i];

//        NSLog(@"%@",data);

        if ([data.commonKey isEqual: @"creationDate"]){
            if (self.isShowTime){
                NSDateFormatter *timeFormatter = [NSDateFormatter new];
                timeFormatter.dateFormat = @"' ' yyyy-MM-dd/HH:mm";
                self.watermarkTitle = [self.watermarkTitle stringByAppendingString:[timeFormatter stringFromDate:data.dateValue]];
            }
        };

        if ([data.commonKey isEqual: @"model"]){
            if (self.isShowModel){
                modelStr =  @" ";
                modelStr = [modelStr stringByAppendingString: data.stringValue];
            }
        };

    };

    //modelStr stays nil when there is no model metadata or isShowModel is NO;
    //appending nil would throw, so guard first
    if (modelStr){
        self.watermarkTitle = [self.watermarkTitle stringByAppendingString:modelStr];
    }

}

- (void)applyVideoEffectsToComposition:(nonnull AVMutableVideoComposition *)composition size:(CGSize)size{


    [self editWatermarkTitle];

    CATextLayer *subtitle1Text = [[CATextLayer alloc] init];

    [subtitle1Text setFontSize:36];

    subtitle1Text.frame = [self CGRectFromWatermarkPlaceAndVideoSzie:size];
    [subtitle1Text setString:self.watermarkTitle];
    [subtitle1Text setAlignmentMode:kCAAlignmentCenter];
    [subtitle1Text setForegroundColor:[[UIColor redColor] CGColor]];

    CALayer *overlayLayer = [CALayer layer];
    [overlayLayer addSublayer:subtitle1Text];

    overlayLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [overlayLayer setMasksToBounds:YES];

    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, size.width, size.height);
    videoLayer.frame = CGRectMake(0, 0, size.width, size.height);
    [parentLayer addSublayer:videoLayer];
    [parentLayer addSublayer:overlayLayer];

    composition.animationTool = [AVVideoCompositionCoreAnimationTool
                                 videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];

}

//Returns the watermark frame for the chosen position
- (CGRect)CGRectFromWatermarkPlaceAndVideoSzie:(CGSize)size{

    switch (self.watermarkPlace) {
        case AlignmentUp:
            return CGRectMake(0, 0, size.width, size.height);
        case AlignmentDown:
            return CGRectMake(0, 0, size.width, 80);
        default:
            return CGRectMake(0, 0, size.width, size.height / 2);
    }
}

- (void)exporterInitialize{

    AVMutableComposition *mixComposition = [[AVMutableComposition alloc] init];

    AVMutableCompositionTrack *videoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [videoTrack insertTimeRange:CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration)
                        ofTrack:[[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];
    AVMutableCompositionTrack *audioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio
                                                                        preferredTrackID:kCMPersistentTrackID_Invalid];
    [audioTrack insertTimeRange: CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration)
                        ofTrack:[[self.videoAsset tracksWithMediaType:AVMediaTypeAudio]
                                 objectAtIndex:0]
                         atTime:kCMTimeZero error:nil];

    self.exporter = [[AVAssetExportSession alloc] initWithAsset:mixComposition
                                                 presetName:AVAssetExportPresetHighestQuality];
    AVMutableVideoCompositionInstruction *mainInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    mainInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, self.videoAsset.duration);

    AVMutableVideoCompositionLayerInstruction *videolayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    AVAssetTrack *videoAssetTrack = [[self.videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];

    mainInstruction.layerInstructions = [NSArray arrayWithObjects:videolayerInstruction,nil];
    AVMutableVideoComposition *mainCompositionInst = [AVMutableVideoComposition videoComposition];
    CGSize naturalSize;

    naturalSize = videoAssetTrack.naturalSize;

    float renderWidth, renderHeight;
    renderWidth = naturalSize.width;
    renderHeight = naturalSize.height;
    mainCompositionInst.renderSize = CGSizeMake(renderWidth, renderHeight);
    mainCompositionInst.instructions = [NSArray arrayWithObject:mainInstruction];
    mainCompositionInst.frameDuration = CMTimeMake(1, 30);

    [self applyVideoEffectsToComposition:mainCompositionInst size:naturalSize];

    self.exporter.videoComposition = mainCompositionInst;
    self.exporter.outputFileType = AVFileTypeQuickTimeMovie;
    self.exporter.shouldOptimizeForNetworkUse = YES;
    self.exporter.metadata = self.videoAsset.metadata;

}

- (void)videoOutput{

    NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *documentsDirectory = [paths objectAtIndex:0];
    NSString *myPathDocs =  [documentsDirectory stringByAppendingPathComponent:
                            [NSString stringWithFormat:@"(Input)FinalVideo-%d.mov",arc4random() % 1000]];
    NSURL *url = [NSURL fileURLWithPath:myPathDocs];

    BOOL flag;

    if (self.delegate){
        flag = [self.delegate willExport:self.exporter];
    }else{
        flag = YES;
    }

    //Set the video output URL
    if (!self.url){
        self.exporter.outputURL = url;
    }else{
        self.exporter.outputURL = self.url;
    }
    //Export
    if (flag){
        [self.exporter exportAsynchronouslyWithCompletionHandler:^{

            dispatch_async(dispatch_get_main_queue(), ^{
                [self exportDidFinish];
            });
        }];
    }
}

@end

A brief introduction

1. Create the video material to be processed:

  1. AVMutableComposition
  2. AVMutableVideoComposition
  3. AVMutableAudioMix
  4. AVMutableVideoCompositionInstruction
  5. AVMutableVideoCompositionLayerInstruction
  6. AVAssetExportSession

AVMutableComposition, AVMutableVideoComposition, AVMutableAudioMix, AVMutableVideoCompositionInstruction, AVMutableVideoCompositionLayerInstruction, AVAssetExportSession, and so on. AVMutableComposition combines audio and video; AVMutableVideoComposition operates on the video itself; AVMutableAudioMix adds audio to the video; AVMutableVideoCompositionInstruction and AVMutableVideoCompositionLayerInstruction are generally used together, to add watermarks or rotate the video; and AVAssetExportSession performs the export. One thing worth noting: once the app enters the background, operations that use the GPU are restricted and will crash, and most of these video-processing features do use the GPU, so you need corresponding safeguards.

Explanation

First create a CommonEditVideo object; the object that uses it must implement the CommonEditVideoDelegate protocol.

The CommonEditVideoDelegate protocol has three methods, called when editing starts and before/after saving to file. willExport returns a boolean: if you only want to show the watermark in the player, return NO; its parameter is the fully prepared AVAssetExportSession, and at that point you can play the session's asset directly. If you want to save the video permanently, return YES; when the export to file finishes, didExport is called with the export status and the URL. By default the video is saved to the photo album, but you can also set the CommonEditVideo object's url property.

The watermark can sit in one of three positions (top, middle, bottom; default middle), and you can choose whether to show the time and the device model. The custom string always comes first.

The code below is the implementation of the view controller I used for testing.

ViewController.m

//
//  ViewController.m
//  MediumEdit
//
//  Created by Input on 2016/9/23.
//  Copyright © 2016 Input. All rights reserved.
//

#import "ViewController.h"
#import "CommonEditVideo.h"
#import <CoreLocation/CoreLocation.h>
#import <ImageIO/ImageIO.h>
#import <MediaPlayer/MPMoviePlayerViewController.h>
#import <MobileCoreServices/UTCoreTypes.h>


@interface ViewController () <UIImagePickerControllerDelegate, UINavigationControllerDelegate, CommonEditVideoDelegate>

@end

@implementation ViewController 

- (void)viewDidLoad {
    [super viewDidLoad];
    UIButton *btn = [UIButton buttonWithType: UIButtonTypeSystem];

    btn.bounds = CGRectMake(0, 0, 100, 40);
    btn.center = self.view.center;
    btn.backgroundColor = [UIColor cyanColor];
    btn.layer.cornerRadius = 5;
    [btn addTarget:self action:@selector(didBtn:) forControlEvents:UIControlEventTouchUpInside];
    [self.view addSubview:btn];
}


- (void)didBtn:(UIButton *)sender{

    UIImagePickerController *mediaUI = [[UIImagePickerController alloc] init];
    mediaUI.sourceType = UIImagePickerControllerSourceTypeSavedPhotosAlbum;
    mediaUI.mediaTypes = [[NSArray alloc] initWithObjects: (NSString *) kUTTypeMovie, nil];

    mediaUI.allowsEditing = YES;
    mediaUI.delegate = self;

    [self presentViewController:mediaUI animated:YES completion:nil];
}

#pragma mark - UIImagePickerControllerDelegate

- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {

    [self dismissViewControllerAnimated:YES completion:nil];

    CommonEditVideo *videoEdit = [[CommonEditVideo alloc]init];
    videoEdit.delegate = self;
    videoEdit.watermarkTitle = @"input";
    videoEdit.watermarkPlace = AlignmentDown;

    //Start editing
    [videoEdit startEditVideo: [info objectForKey:UIImagePickerControllerMediaURL]];

    //Export the video
    [videoEdit videoOutput];

}

- (void)imagePickerControllerDidCancel:(UIImagePickerController *)picker {
    [self dismissViewControllerAnimated:YES completion:nil];
}

#pragma mark - CommonEditVideoDelegate

- (void)willEdit{
    NSLog(@"%@", @"start editing");
}
- (BOOL)willExport:(AVAssetExportSession *)exporter{
    NSLog(@"%@", @"editing finished\nstarting export to file");

//    NSLog(@"%lld",exporter.asset.duration.value);
    return YES;
}
- (void)didExport:(NSURL *)url error:(NSError *)error{

    if (!error){
        NSLog(@"%@", url);

        MPMoviePlayerViewController *playerCtr = [[MPMoviePlayerViewController alloc]initWithContentURL:url];
        [self presentViewController:playerCtr animated:YES completion:nil];
    }
}

@end

I haven't tested this for long, and only with videos shot by the system camera; if you run into bugs, please leave a comment and we can discuss them together.

This article is a summary of several months spent building short-video features. If anything in the text or code below is off, please point out the mistakes and offer suggestions.

#pragma mark - Video editing
- (void)videoEdit{
    //1. Load the source material
    AVAsset *asset = [AVAsset assetWithURL:self.videoUrl];
    //The source's video track
    AVAssetTrack *videoAssetTrack = [[asset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0];
    //The source's audio track
    AVAssetTrack *audioAssetTrack = [[asset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0];

    // Trim to half duration
    // double halfDuration = CMTimeGetSeconds([asset duration]) - 5;
    CMTime trimmedDuration = CMTimeSubtract(_videoView.newEndTime, _videoView.newStartTime);
    CMTimeShow(trimmedDuration);

    //2. Insert the source video into a video track and the audio into an audio track
    //This is the "project file"
    self.composition = [AVMutableComposition composition];
    //Video track
    AVMutableCompositionTrack *videoCompositionTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    //Insert a time range of the video into the video track
    [videoCompositionTrack insertTimeRange:CMTimeRangeMake(_videoView.newStartTime, trimmedDuration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];
    //Audio track
    AVMutableCompositionTrack *audioCompositionTrack = [self.composition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    //Insert the audio data, otherwise there is no sound
    [audioCompositionTrack insertTimeRange:CMTimeRangeMake(_videoView.newStartTime, trimmedDuration) ofTrack:audioAssetTrack atTime:kCMTimeZero error:nil];
    //Other audio/video could be added here:
    //[videoCompositionTrack insertTimeRange:CMTimeRangeMake(_newStartTime, trimmedDuration) ofTrack:videoAssetTrack atTime:kCMTimeZero error:nil];

    //3. Crop the video
    //AVMutableVideoComposition manages all video tracks and decides the final size; cropping happens here
    self.videoComposition = [AVMutableVideoComposition videoComposition];
    self.videoComposition.frameDuration = CMTimeMake(1, 30);   //arguments lost in the original post; 30 fps assumed
    self.videoComposition.renderSize = videoAssetTrack.naturalSize;

    //AVMutableVideoCompositionInstruction: one video in the track; can be scaled, rotated, etc.
    AVMutableVideoCompositionInstruction *instruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, trimmedDuration);

    //3.2 AVMutableVideoCompositionLayerInstruction: one video track, covering all the material on that track
    AVAssetTrack *videoTrack = [self.composition tracksWithMediaType:AVMediaTypeVideo][0];
    AVMutableVideoCompositionLayerInstruction *layerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:videoTrack];
    //Rotation handling
    if (self.isRotate) {
        CGAffineTransform t1 = CGAffineTransformMakeTranslation(videoCompositionTrack.naturalSize.height, 0.0);
        // Rotate transformation
        CGAffineTransform t2 = CGAffineTransformRotate(t1, degreesToRadians(90));   //angle lost in the original post; 90° assumed
        [layerInstruction setTransform:t2 atTime:kCMTimeZero];
        self.videoComposition.renderSize = CGSizeMake(videoAssetTrack.naturalSize.height, videoAssetTrack.naturalSize.width);
    }

    //3.3 Add instructions
    instruction.layerInstructions = [NSArray arrayWithObjects:layerInstruction, nil];
    self.videoComposition.instructions = [NSArray arrayWithObject:instruction];
    //Watermarks are reset whenever the player is refreshed, so add the watermark at export time
}


Here I use Apple's official demo "AVSimpleEditoriOS" as the teaching example. It organizes its code with the Command design pattern, where the base class AVSECommand holds the properties shared by the Command subclasses. This article gives a brief introduction to the video operations and highlights the key code; I hope it serves as a starting point so that you get an initial picture of video editing and can then modify Apple's official demo accordingly. You can download and run the demo to see the results for yourself.

In terms of basic short-video recording features, I think you need the following: beautification, watermarks, segmented (pause/resume) recording, and editing, where editing covers filters, background music, stickers, and high-quality compression (lowering the video bitrate). I will introduce how to implement each of these in turn.

2. Video rotation and watermarks: after processing, the player needs to be refreshed, and before refreshing you must set self.videoComposition.animationTool = NULL, which clears the watermark. The real watermark therefore has to be added right before export; the watermark shown on the playerLayer is only a "fake" preview watermark.

@property AVMutableComposition *mutableComposition;
@property AVMutableVideoComposition *mutableVideoComposition;
@property AVMutableAudioMix *mutableAudioMix;
@property CALayer *watermarkLayer;

Part 1: adding a watermark and a background border to a video

Third-party libraries used: TZImagePickerController, GPUImage, SDAVAssetExportSession

if (self.isAddWaterMark) {
    CGSize videoSize = self.videoComposition.renderSize;
    self.watermarkLayer = [self watermarkLayerForSize:videoSize];
    CALayer *exportWatermarkLayer = [self copyWatermarkLayer:self.watermarkLayer];
    CALayer *parentLayer = [CALayer layer];
    CALayer *videoLayer = [CALayer layer];
    parentLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
    videoLayer.frame = CGRectMake(0, 0, self.videoComposition.renderSize.width, self.videoComposition.renderSize.height);
    [parentLayer addSublayer:videoLayer];
    exportWatermarkLayer.position = CGPointMake(self.videoComposition.renderSize.width/2, self.videoComposition.renderSize.height/4);
    [parentLayer addSublayer:exportWatermarkLayer];

    CABasicAnimation *anima = [CABasicAnimation animationWithKeyPath:@"opacity"];
    anima.fromValue = [NSNumber numberWithFloat:1.0f];
    anima.toValue = [NSNumber numberWithFloat:0.0f];
    anima.repeatCount = 0;
    anima.duration = 5.0f;   //fades out after 5 s
    [anima setRemovedOnCompletion:NO];
    [anima setFillMode:kCAFillModeForwards];
    anima.beginTime = AVCoreAnimationBeginTimeAtZero;
    [exportWatermarkLayer addAnimation:anima forKey:@"opacityAniamtion"];

    self.videoComposition.animationTool = [AVVideoCompositionCoreAnimationTool videoCompositionCoreAnimationToolWithPostProcessingAsVideoLayer:videoLayer inLayer:parentLayer];
}

(Images: the Command design pattern; code for trimming the head of the video)

In this first part I explain how to add a border and an animation to a video. Note up front that this does not directly modify individual frames of the video to draw a border or produce the animation; the effect is more like placing a CALayer on top of the video and animating that layer, because frame-by-frame editing of that kind is not something an iPhone should do, nor can it.

Demo screenshots

3. Exporting the video

  1. Obtain the video and audio assets
  2. Create an AVMutableComposition object
  3. Add the video asset to the AVMutableComposition object, setting its time range and insertion point
  4. Add the audio asset to the AVMutableComposition object, setting its time range and insertion point

Let's first look at a diagram to understand the principle behind adding animation to a video.

The demo's address is attached at the end of the article.

//Save to the sandbox
[self creatSandBoxFilePathIfNoExist];
NSString *pathDocuments = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) objectAtIndex:0];
NSString *videoPath = [NSString stringWithFormat:@"%@/Video", pathDocuments];
NSString *urlPath = [videoPath stringByAppendingPathComponent:@"cyan.mp4"];
//Remove any existing file first
NSFileManager *manager = [NSFileManager defaultManager];
[manager removeItemAtPath:urlPath error:nil];

// AVAssetExportPresetPassthrough / AVAssetExportPresetHighestQuality
self.exportSession = [[AVAssetExportSession alloc] initWithAsset:self.composition presetName:AVAssetExportPresetHighestQuality];
self.exportSession.videoComposition = self.videoComposition;
self.exportSession.outputURL = [NSURL fileURLWithPath:urlPath];
self.exportSession.outputFileType = AVFileTypeMPEG4;
// exporter.shouldOptimizeForNetworkUse = YES;
[self.exportSession exportAsynchronouslyWithCompletionHandler:^{
    int exportStatus = self.exportSession.status;
    NSLog(@"exportStatus : %d", exportStatus);
    switch (exportStatus) {
        case AVAssetExportSessionStatusFailed: {
            // log the error
            NSError *exportError = self.exportSession.error;
            NSLog(@"AVAssetExportSessionStatusFailed: %@", exportError);
            break;
        }
        case AVAssetExportSessionStatusCompleted: {
            //Save to the photo album
            [self writeVideoToPhotoLibrary:[NSURL fileURLWithPath:urlPath]];
            NSLog(@"video transcoded successfully");
        }
    }
}];

While researching video rotation I came across a very good reference that explains the principles of matrix arithmetic and matrix operations on views; I recommend it here.


The preview supports toggling beautification and switching between the front and rear cameras.

Read and parse video-frame images, used for visual trimming of the video.

(Images: video rotation code, figures 1–3)

You can see the videoLayer: this layer is what actually displays our video. At the same level sits something called the animationLayer, and the animationLayer is the one we can control and play with, because we create it ourselves.


#pragma mark - Read and parse video frames
- (void)analysisVideoFrames{
    AVURLAsset *videoAsset = [[AVURLAsset alloc] initWithURL:self.videoUrl options:nil];
    //Total length in seconds = total frame count / frames per second
    long videoSumTime = videoAsset.duration.value / videoAsset.duration.timescale;
    //Create the AVAssetImageGenerator
    AVAssetImageGenerator *generator = [[AVAssetImageGenerator alloc] initWithAsset:videoAsset];
    generator.maximumSize = self.bottomView.frame.size;
    generator.appliesPreferredTrackTransform = YES;
    generator.requestedTimeToleranceBefore = kCMTimeZero;
    generator.requestedTimeToleranceAfter = kCMTimeZero;
    //Collect the times of the frames we need
    self.framesArray = [NSMutableArray array];
    for (NSInteger index = 0; index < videoSumTime; index++) {
        CMTime time = CMTimeMake(index * videoAsset.duration.timescale, videoAsset.duration.timescale);
        NSValue *value = [NSValue valueWithCMTime:time];
        [self.framesArray addObject:value];
    }
    __block long count = 0;
    __weak typeof(self) weakSelf = self;
    __block UIImage *showImage = [[UIImage alloc] init];
    __block CGFloat showImageViewWitd = (self.bottomView.frame.size.width - self.leftView.frame.size.width * 2) / videoSumTime;
    [generator generateCGImagesAsynchronouslyForTimes:self.framesArray completionHandler:^(CMTime requestedTime, CGImageRef _Nullable image, CMTime actualTime, AVAssetImageGeneratorResult result, NSError * _Nullable error) {
        if (result == AVAssetImageGeneratorSucceeded) {
            showImage = [UIImage imageWithCGImage:image];
            dispatch_async(dispatch_get_main_queue(), ^{
                UIImageView *thumImgView = [[UIImageView alloc] initWithFrame:CGRectMake(20 + count * showImageViewWitd, 0, showImageViewWitd, 40)];
                thumImgView.image = showImage;
                [weakSelf.showImageViewBgView addSubview:thumImgView];
                count++;
            });
        }
        if (result == AVAssetImageGeneratorFailed) {
            NSLog(@"Failed with error: %@", [error localizedDescription]);
        }
        if (result == AVAssetImageGeneratorCancelled) {
            NSLog(@"AVAssetImageGeneratorCancelled");
        }
    }];
}
  1. Obtain the video and audio assets
  2. Create an AVMutableComposition object
  3. Add the video asset to the AVMutableComposition object, setting its time range and insertion point
  4. Add the audio asset to the AVMutableComposition object, setting its time range and insertion point
  5. Set the rotation transform matrix
  6. Create an AVMutableVideoComposition object
  7. Set the video's render size and frame duration
  8. Create the video composition instruction AVMutableVideoCompositionInstruction and set its time range in the video and the rotation transform
  9. Create the video composition layer instruction AVMutableVideoCompositionLayerInstruction and set its time range and the rotation transform
  10. Put the layer instruction into the instruction, then put that into the composition object

It is actually simple. The layer at the same level as our videoLayer is called animationLayer (it is the background), and together they have a parent called parentLayer. Adding a border is just a matter of giving the animationLayer a border image, placing it beneath the videoLayer, and cropping the videoLayer so that it just reveals the four edges of the animationLayer; that produces the border effect. Put simply, the cropped videoLayer is laid on the background and both are added to the parentLayer.



Here is example code for the different rotation angles; I hope it helps you and saves you some time.

Concrete steps for adding a watermark and a background border to a video

Recording

CGAffineTransform translateToCenter;
if (self.degrees != 0) {
    CGAffineTransform mixedTransform;
    if (self.degrees == 90) {
        //Rotate 90° clockwise
        NSLog(@"rotate 90°, home button on the left");
        translateToCenter = CGAffineTransformMakeTranslation(mixedVideoTrack.naturalSize.height, 0.0);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2);
        waterMarkVideoComposition.renderSize = CGSizeMake(mixedVideoTrack.naturalSize.height, mixedVideoTrack.naturalSize.width);
    } else if (self.degrees == 180) {
        //Rotate 180° clockwise
        NSLog(@"rotate 180°, home button on the top");
        translateToCenter = CGAffineTransformMakeTranslation(mixedVideoTrack.naturalSize.width, mixedVideoTrack.naturalSize.height);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI);
        waterMarkVideoComposition.renderSize = CGSizeMake(mixedVideoTrack.naturalSize.width, mixedVideoTrack.naturalSize.height);
    } else if (self.degrees == 270) {
        //Rotate 270° clockwise
        NSLog(@"rotate 270°, home button on the right");
        translateToCenter = CGAffineTransformMakeTranslation(0.0, assetVideoTrack.naturalSize.width);
        mixedTransform = CGAffineTransformRotate(translateToCenter, M_PI_2 * 3.0);
        waterMarkVideoComposition.renderSize = CGSizeMake(mixedVideoTrack.naturalSize.height, mixedVideoTrack.naturalSize.width);
    }

    AVMutableVideoCompositionInstruction *roateInstruction = [AVMutableVideoCompositionInstruction videoCompositionInstruction];
    roateInstruction.timeRange = CMTimeRangeMake(kCMTimeZero, [mixComposition duration]);
    AVMutableVideoCompositionLayerInstruction *roateLayerInstruction = [AVMutableVideoCompositionLayerInstruction videoCompositionLayerInstructionWithAssetTrack:mixedVideoTrack];
    [roateLayerInstruction setTransform:mixedTransform atTime:kCMTimeZero];
    roateInstruction.layerInstructions = @[roateLayerInstruction];
    //Feed the rotation into the video composition
    waterMarkVideoComposition.instructions = @[roateInstruction];
}

1. Obtain the video and audio assets


(Images: adding audio to a video, code figures 1 and 2)

2. Create an AVMutableComposition object


  1. Obtain the video and audio assets
  2. Create an AVMutableComposition object
  3. Add the video asset to the AVMutableComposition object, setting its time range and insertion point
  4. Add the audio asset to the AVMutableComposition object, setting its time range and insertion point
  5. Add the extra audio asset to the AVMutableComposition object, setting its time range, insertion point, and mix mode

3. Add the video asset to the AVMutableComposition object, setting its time range and insertion point

Pause recording, with support for deleting back to the last breakpoint

(Images: adding a watermark, figures 1 and 2)

4. Add the audio asset to the AVMutableComposition object, setting its time range and insertion point


  1. Obtain the video and audio assets
  2. Create an AVMutableComposition object
  3. Add the video asset to the AVMutableComposition object, setting its time range and insertion point
  4. Add the audio asset to the AVMutableComposition object, setting its time range and insertion point
  5. Create the video compositor object AVMutableVideoComposition and set its frame duration and render size
  6. Create the video compositor instruction object and set its effective time range
  7. Create the video compositor layer-instruction object and set its effective time range
  8. Put the layer-instruction object into the instruction object
  9. Put the instruction object into the video compositor object
  10. Create the watermark CALayer, set its frame and position, and add it to the video compositor

5. Create the video compositor object AVMutableVideoComposition and set its frame duration and render size

