Monday, September 16, 2013

Unity3D Research Institute: iOS Screenshots, Microphone Recording, and Saving a Screen-Capture Video to the Sandbox (an Alternative Approach) (Part 35)

http://www.xuanyusong.com/archives/1769

         It has been two weeks since my last post; MOMO has been extremely busy with all kinds of overtime, developing for iOS and Android at the same time, and the exciting days continue, haha. Yesterday a friend called to tell me that his team's Unity3D project had run into a tricky problem: they wanted to add screen recording (video plus audio) to a Unity3D app and export the result as an .mp4 file. As far as I know, Unity3D has no built-in screen-recording feature, only screenshot capture. I had not touched Unity3D for a while and could not resist digging in, and after some effort I worked out how to record a screen-capture video from Unity3D together with the iOS front end. The idea behind the implementation is:
1. Capture a screenshot of every frame and assemble the N captured images into a silent .mp4 video.
2. At the same time, record sound from the microphone and save it as a .caf file.
3. Finally, merge the silent video and the audio into a brand-new video file and save it in the sandbox.
        Their Unity3D project is a bit special: think of it as an iOS application built on top of the Unity3D engine. Unity3D is only responsible for displaying a 3D model, while the entire UI is implemented in Objective-C on the iOS side. That creates a problem: a screenshot taken from Objective-C only contains the UI, and a screenshot taken from Unity3D only contains the 3D content. To solve this, every time we capture the screen we merge those two images into a single new image. As an aside, Apple's private screenshot API can capture both the UI and the U3D content in one image, but the app might then fail App Store review, so it is safer to stick with the merging approach.
OK, now MOMO will walk through the implementation.
First, create a new Unity project and add a cube object. To make the result easy to see, we write a small script that keeps the cube rotating, so the motion shows up clearly in the recorded video.
Test.cs is attached directly to the cube object. The code is simple, so it needs no further explanation.

using UnityEngine;
using System.Collections;

public class Test : MonoBehaviour {

 int count = 0;
 void Start () {

 }

 // Update is called once per frame
 void Update ()
 {

  this.transform.Rotate(new Vector3(0,1,0));
 }

 //The Objective-C side calls this method via UnitySendMessage("Cube", "StartScreenshot", ""),
 //so the cube GameObject in the scene must be named "Cube"
 void StartScreenshot(string str)
 {
  //On iOS the captured file ends up in the app's Documents folder, which is where
  //the Objective-C code later reads "<count>u3d.JPG" from
  Application.CaptureScreenshot(count +"u3d.JPG");
  count++;
 }
}


Next, export this Unity3D project as an iOS build; Unity generates the corresponding Xcode project. We write a brand-new ViewController that sits on top of the OpenGL view controller Unity generates and hosts the UI controls, then open AppController.mm.
Add our code near the end of the following method:
int OpenEAGL_UnityCallback(UIWindow** window, int* screenWidth, int* screenHeight,  int* openglesVersion)
{
 CGRect rect = [[UIScreen mainScreen] bounds];

 // Create a full-screen window
 _window = [[UIWindow alloc] initWithFrame:rect];
 EAGLView* view = [[EAGLView alloc] initWithFrame:rect];
 UnityViewController *controller = [[UnityViewController alloc] init];

 sGLViewController = controller;
 sGLView = view;

#if defined(__IPHONE_3_0)
 if( _ios30orNewer )
  controller.wantsFullScreenLayout = TRUE;
#endif

 controller.view = view;

 CreateSplashView( UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone ? (UIView*)_window : (UIView*)view );
 CreateActivityIndicator(_splashView);

 // add only now so controller have chance to reorient *all* added views
 [_window addSubview:view];
 if (UI_USER_INTERFACE_IDIOM() == UIUserInterfaceIdiomPhone)
  [_window bringSubviewToFront:_splashView];

 _autorotEnableHandling = true;
 [[NSNotificationCenter defaultCenter] postNotificationName: UIDeviceOrientationDidChangeNotification object: [UIDevice currentDevice]];

 // reposition activity indicator after we rotated views
 if (_activityIndicator)
  _activityIndicator.center = CGPointMake([_splashView bounds].size.width/2, [_splashView bounds].size.height/2);

 int openglesApi =
#if defined(__IPHONE_3_0) && USE_OPENGLES20_IF_AVAILABLE
 kEAGLRenderingAPIOpenGLES2;
#else
 kEAGLRenderingAPIOpenGLES1;
#endif

 for (; openglesApi >= kEAGLRenderingAPIOpenGLES1 && !_context; --openglesApi)
 {
  if (!UnityIsRenderingAPISupported(openglesApi))
   continue;

  _context = [[EAGLContext alloc] initWithAPI:openglesApi];
 }

 if (!_context)
  return false;

 if (![EAGLContext setCurrentContext:_context]) {
  _context = 0;
  return false;
 }

 const GLuint colorFormat = UnityUse32bitDisplayBuffer() ? GL_RGBA8_OES : GL_RGB565_OES;

 if (!CreateWindowSurface(view, colorFormat, GL_DEPTH_COMPONENT16_OES, UnityGetDesiredMSAASampleCount(MSAA_DEFAULT_SAMPLE_COUNT), NO, &_surface)) {
  return false;
 }

 glViewport(0, 0, _surface.w, _surface.h);
 [_window makeKeyAndVisible];
 [view release];

 *window = _window;
 *screenWidth = _surface.w;
 *screenHeight = _surface.h;
 *openglesVersion = _context.API;

 _glesContextCreated = true;

//-------------------- The MyViewController below is the controller we just wrote ----------------
    MyViewController * myView =  [[MyViewController alloc] init];
    [sGLViewController.view addSubview:myView.view];
//-------------------- The MyViewController above is the controller we just wrote ----------------
 return true;
}


If yours is not a Unity3D project, remember to add AVFoundation.framework and MediaPlayer.framework yourself; a project exported from Unity3D links these two frameworks automatically.

//
//  MyViewController.h
//  avcount
//
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import <AVFoundation/AVFoundation.h>
#import <MediaPlayer/MediaPlayer.h>  

@interface MyViewController : UIViewController<AVAudioRecorderDelegate,AVCaptureVideoDataOutputSampleBufferDelegate,AVCaptureAudioDataOutputSampleBufferDelegate>
{
  //timer that drives the per-frame capture
  NSTimer *_timer;
  int _count;
  UILabel * _labe;
  //audio recorder
  AVAudioRecorder * _recorder;
  //loading indicator shown while the video is being built
  UITextView *_sharedLoadingTextView;
  UIActivityIndicatorView* _sharedActivityView;

}

@end

Below is the concrete implementation. MOMO pieced the core code together from articles by foreign developers found online and finally combined them into a working whole; it took several hours of research and nearly brought me to tears~~~~ Since there is quite a lot of code, please read it carefully.

//
//  MyViewController.m
//  avcount
//
//  Created by 雨松MOMO on 12-9-14.
//  Copyright (c) 2012 雨松MOMO. All rights reserved.
//

#import "MyViewController.h"

@interface MyViewController ()

@end

@implementation MyViewController

- (id)initWithNibName:(NSString *)nibNameOrNil bundle:(NSBundle *)nibBundleOrNil
{
    self = [super initWithNibName:nibNameOrNil bundle:nibBundleOrNil];
    if (self) {
        // Custom initialization
    }
    return self;
}

- (void)viewDidLoad
{
    [super viewDidLoad];
    self.view.backgroundColor = [UIColor redColor];
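    //render the key window's layer into an image once and save it to the photo album
    //(the per-frame capture in heartBeat below uses the same renderInContext technique)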
    UIWindow *screenWindow = [[UIApplication sharedApplication] keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *viewImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    UIImageWriteToSavedPhotosAlbum(viewImage, nil, nil, nil);

#if !TARGET_IPHONE_SIMULATOR
    self.view.backgroundColor = [UIColor greenColor];
#else
    self.view.backgroundColor = [UIColor clearColor];
#endif

    UIButton * start = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [start setFrame:CGRectMake(0, 0, 200, 30)];
    [start setTitle:@"开始截屏" forState:UIControlStateNormal];
    [start addTarget:self action:@selector(startPress) forControlEvents:UIControlEventTouchDown];

    UIButton * end = [UIButton buttonWithType:UIButtonTypeRoundedRect];
    [end setFrame:CGRectMake(0, 50, 200, 30)];
    [end setTitle:@"结束截屏(开始录制视频)" forState:UIControlStateNormal];
    [end addTarget:self action:@selector(endPress) forControlEvents:UIControlEventTouchDown];

    [self.view addSubview:start];
    [self.view addSubview:end];

    _labe = [[[UILabel alloc]initWithFrame:CGRectMake(30, 200, 300, 30)]autorelease];
    _labe.text = [NSString stringWithFormat:@"%@%d",@"雨松MOMO开始计时:===  ",_count];
    [self.view addSubview:_labe];

    //initialize audio recording
    [self prepareToRecord];
}

-(void)addLoading:(NSString*) info
{
    //text view shown on top
    _sharedLoadingTextView = [[[UITextView alloc] initWithFrame:CGRectMake(0, 0, 130, 130)] autorelease];
    [_sharedLoadingTextView setBackgroundColor:[UIColor blackColor]];
    [_sharedLoadingTextView setText:info];
    [_sharedLoadingTextView setTextColor:[UIColor whiteColor]];
    [_sharedLoadingTextView setTextAlignment:UITextAlignmentCenter];
    [_sharedLoadingTextView setFont:[UIFont systemFontOfSize:15]];
    _sharedLoadingTextView.textAlignment = UITextAlignmentCenter;
    _sharedLoadingTextView.alpha = 0.8f;
    _sharedLoadingTextView.center = self.view.center;
    _sharedLoadingTextView.layer.cornerRadius = 10;
    _sharedLoadingTextView.layer.masksToBounds = YES;

    //create the loading spinner view
    _sharedActivityView = [[[UIActivityIndicatorView alloc] initWithActivityIndicatorStyle:UIActivityIndicatorViewStyleGray] autorelease];
    //set the spinner style, in this case large white
    _sharedActivityView.activityIndicatorViewStyle=UIActivityIndicatorViewStyleWhiteLarge;
    //set the area in which it is displayed
    _sharedActivityView.frame = CGRectMake(0,0, 320, 480);

    _sharedActivityView.center = self.view.center;
    //start the animation
    [_sharedActivityView startAnimating];

    [self.view addSubview:_sharedLoadingTextView];
    [self.view addSubview:_sharedActivityView];

}

-(void)removeLoading
{
    [_sharedLoadingTextView removeFromSuperview];
    [_sharedActivityView removeFromSuperview];
}

-(void)startPress
{

    _count = 0;
    _timer = [NSTimer scheduledTimerWithTimeInterval: 0.1
                                              target: self
                                            selector: @selector(heartBeat:)
                                            userInfo: nil
                                             repeats: YES];

    //start recording audio
    [_recorder record];
}

-(void)endPress
{

    if(_timer != nil)
    {
        [_timer invalidate];
        _timer = nil;

        [self addLoading:@"开始制作视频"];
        [NSThread detachNewThreadSelector:@selector(startThreadMainMethod) toTarget:self withObject:nil];
   }
}

-(void)startThreadMainMethod
{
    //this runs on its own thread, so it needs its own autorelease pool
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

    //build the video here
    NSMutableArray *_array = [[[NSMutableArray alloc]init]autorelease];
    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];
    for(int i =0; i< _count; i++)
    {

        //build the paths of the screenshot files stored in the sandbox
        NSString *  _pathSecond = [NSString stringWithFormat:@"%@/%d%@",Path,i,@".JPG"];
        NSString *  _pathFirst = [NSString stringWithFormat:@"%@/%d%@",Path,i,@"u3d.JPG"];

        //since these are only paths, load each file into an NSData object
        NSData *data0=[NSData dataWithContentsOfFile:_pathFirst];
        NSData *data1=[NSData dataWithContentsOfFile:_pathSecond];
        //and decode the images from the data
        UIImage *img0=[UIImage imageWithData:data0];
        UIImage *img1=[UIImage imageWithData:data1];

        [_array addObject:[self MergerImage : img0 : img1]];
    }

    Path = [NSString stringWithFormat:@"%@/%@%@",Path,@"veido",@".MP4"];

    [_recorder stop];
    [self writeImages:_array ToMovieAtPath:Path withSize: CGSizeMake(320, 480) inDuration:_count*0.1 byFPS:10];

    //UIKit must only be touched on the main thread
    dispatch_async(dispatch_get_main_queue(), ^{
        [self removeLoading];

        NSLog(@"recorder successfully");
        UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"视频录制成功"
                                                                    delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
        [recorderSuccessful show];
        [recorderSuccessful release];
    });

    [pool release];
}

- (void) heartBeat: (NSTimer*) timer
{
    _labe.text = [NSString stringWithFormat:@"%@%d",@"雨松MOMO开始计时:===  ",_count];

    //This uses a private API; with bad luck the app will be rejected by Apple
    //It is quite powerful though: it grabs the iOS UI and the U3D content in a single image
    //extern CGImageRef UIGetScreenImage();
    //UIImage *image = [UIImage imageWithCGImage:UIGetScreenImage()];
    //UIImageWriteToSavedPhotosAlbum(image,nil,nil,nil);

    //To be safe, take the screenshot with the method below instead
    //Note that this method cannot capture the U3D content
    UIWindow *screenWindow = [[UIApplication sharedApplication]keyWindow];
    UIGraphicsBeginImageContext(screenWindow.frame.size);
    [screenWindow.layer renderInContext:UIGraphicsGetCurrentContext()];
    UIImage *image = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();

    NSData *data;

    if (UIImagePNGRepresentation(image) == nil)
    {
        data = UIImageJPEGRepresentation(image, 1);
    }
    else
    {
        data = UIImagePNGRepresentation(image);
    }

    NSFileManager *fileManager = [NSFileManager defaultManager];

    NSString * Path = [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"];

    [fileManager createDirectoryAtPath:Path withIntermediateDirectories:YES attributes:nil error:nil];
    Path = [NSString stringWithFormat:@"%@/%d%@",Path,_count,@".JPG"];
    [fileManager createFileAtPath:Path contents:data attributes:nil];

    //Tell U3D to take its own screenshot
    UnitySendMessage("Cube","StartScreenshot","");

    _count++;
}

//Merge the iOS foreground (UI) screenshot and the U3D screenshot into a single image
-(UIImage*) MergerImage:(UIImage*) firstImg:(UIImage*) secondImg
{

    UIGraphicsBeginImageContext(CGSizeMake(320, 480));

    [firstImg drawInRect:CGRectMake(0, 0, firstImg.size.width, firstImg.size.height)];

    [secondImg drawInRect:CGRectMake(0, 0, secondImg.size.width, secondImg.size.height)];

    UIImage *resultImage=UIGraphicsGetImageFromCurrentImageContext();

    UIGraphicsEndImageContext();

    return resultImage;

}

- (void)viewDidUnload
{
    [super viewDidUnload];
    // Release any retained subviews of the main view.
}

- (BOOL)shouldAutorotateToInterfaceOrientation:(UIInterfaceOrientation)interfaceOrientation
{
    return (interfaceOrientation == UIInterfaceOrientationPortrait);
}

- (CVPixelBufferRef) pixelBufferFromCGImage: (CGImageRef) image andSize:(CGSize) size
{
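    //create a CVPixelBuffer of the requested size and draw the CGImage into it through a
    //CGBitmapContext, so the frame can be handed to the AVAssetWriterInputPixelBufferAdaptor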
    NSDictionary *options = [NSDictionary dictionaryWithObjectsAndKeys:
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGImageCompatibilityKey,
                             [NSNumber numberWithBool:YES], kCVPixelBufferCGBitmapContextCompatibilityKey,
                             nil];
    CVPixelBufferRef pxbuffer = NULL;

    CVReturn status = CVPixelBufferCreate(kCFAllocatorDefault, size.width,
                                          size.height, kCVPixelFormatType_32ARGB, (CFDictionaryRef) options,
                                          &pxbuffer);
    NSParameterAssert(status == kCVReturnSuccess && pxbuffer != NULL);

    CVPixelBufferLockBaseAddress(pxbuffer, 0);
    void *pxdata = CVPixelBufferGetBaseAddress(pxbuffer);
    NSParameterAssert(pxdata != NULL);

    CGColorSpaceRef rgbColorSpace = CGColorSpaceCreateDeviceRGB();
    CGContextRef context = CGBitmapContextCreate(pxdata, size.width,
                                                 size.height, 8, 4*size.width, rgbColorSpace,
                                                 kCGImageAlphaNoneSkipFirst);
    NSParameterAssert(context);
    CGContextConcatCTM(context, CGAffineTransformMakeRotation(0));
    CGContextDrawImage(context, CGRectMake(0, 0, CGImageGetWidth(image),
                                           CGImageGetHeight(image)), image);
    CGColorSpaceRelease(rgbColorSpace);
    CGContextRelease(context);

    CVPixelBufferUnlockBaseAddress(pxbuffer, 0);

    return pxbuffer;
}

- (void) writeImages:(NSArray *)imagesArray ToMovieAtPath:(NSString *) path withSize:(CGSize) size
          inDuration:(float)duration byFPS:(int32_t)fps
{

    //Here the previously captured images are combined into a single video
    //Wire the writer:
    NSError *error = nil;
    AVAssetWriter *videoWriter = [[[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:path]
                                                            fileType:AVFileTypeQuickTimeMovie
                                                               error:&error] autorelease];
    NSParameterAssert(videoWriter);

    NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                   AVVideoCodecH264, AVVideoCodecKey,
                                   [NSNumber numberWithInt:size.width], AVVideoWidthKey,
                                   [NSNumber numberWithInt:size.height], AVVideoHeightKey,
                                   nil];

    AVAssetWriterInput* videoWriterInput = [[AVAssetWriterInput
                                             assetWriterInputWithMediaType:AVMediaTypeVideo
                                             outputSettings:videoSettings] retain];

    AVAssetWriterInputPixelBufferAdaptor *adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                     assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                     sourcePixelBufferAttributes:nil];
    NSParameterAssert(videoWriterInput);
    NSParameterAssert([videoWriter canAddInput:videoWriterInput]);
    [videoWriter addInput:videoWriterInput];

    //Start a session:
    [videoWriter startWriting];
    [videoWriter startSessionAtSourceTime:kCMTimeZero];

    //Write some samples:
    CVPixelBufferRef buffer = NULL;

    int frameCount = 0;

    int imagesCount = [imagesArray count];
    float averageTime = duration/imagesCount;
    int averageFrame = (int)(averageTime * fps);

    for(UIImage * img in imagesArray)
    {
        buffer = [self pixelBufferFromCGImage:[img CGImage] andSize:size];

        BOOL append_ok = NO;
        int j = 0;
        while (!append_ok && j < 30)
        {
            if (adaptor.assetWriterInput.readyForMoreMediaData)
            {
                printf("appending %d attemp %d\n", frameCount, j);

                CMTime frameTime = CMTimeMake(frameCount,(int32_t) fps);
                float frameSeconds = CMTimeGetSeconds(frameTime);
                NSLog(@"frameCount:%d,kRecordingFPS:%d,frameSeconds:%f",frameCount,fps,frameSeconds);
                append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];

                if(buffer)
                    [NSThread sleepForTimeInterval:0.05];
            }
            else
            {
                printf("adaptor not ready %d, %d\n", frameCount, j);
                [NSThread sleepForTimeInterval:0.1];
            }
            j++;
        }
        if (!append_ok) {
            printf("error appending image %d times %d\n", frameCount, j);
        }

        //release the pixel buffer created for this frame so the loop does not leak memory
        CVPixelBufferRelease(buffer);

        frameCount = frameCount + averageFrame;
    }

    //Finish the session:
    [videoWriterInput markAsFinished];
    [videoWriter finishWriting];
    NSLog(@"finishWriting");

    //Merge the silent video and the recorded audio into a new video
    [self CompileFilesToMakeMovie];

}

- (void) prepareToRecord

{
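    //configure the shared AVAudioSession for recording and create an AVAudioRecorder
    //that writes uncompressed PCM audio to Documents/sound.caf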

    AVAudioSession *audioSession = [AVAudioSession sharedInstance];

    NSError *err = nil;

    [audioSession setCategory :AVAudioSessionCategoryPlayAndRecord error:&err];

    if(err){

        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);

        return;

    }

    [audioSession setActive:YES error:&err];

    //check whether activating the session failed
    if(err){

        NSLog(@"audioSession: %@ %d %@", [err domain], [err code], [[err userInfo] description]);

        return;

    }

    err = nil;

    NSMutableDictionary *recordSetting = [[NSMutableDictionary alloc] init];

    [recordSetting setValue :[NSNumber numberWithInt:kAudioFormatLinearPCM] forKey:AVFormatIDKey];

    [recordSetting setValue:[NSNumber numberWithFloat:44100.0] forKey:AVSampleRateKey];

    [recordSetting setValue:[NSNumber numberWithInt: 2] forKey:AVNumberOfChannelsKey];

    [recordSetting setValue :[NSNumber numberWithInt:16] forKey:AVLinearPCMBitDepthKey];

    [recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsBigEndianKey];

    [recordSetting setValue :[NSNumber numberWithBool:NO] forKey:AVLinearPCMIsFloatKey];

    // Create a new dated file
    NSString * recorderFilePath = [[NSString stringWithFormat:@"%@/%@.caf", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], @"sound"] retain];
    NSURL *url = [NSURL fileURLWithPath:recorderFilePath];
    err = nil;
     _recorder = [[ AVAudioRecorder alloc] initWithURL:url settings:recordSetting error:&err];
    if(!_recorder){
        NSLog(@"recorder: %@ %d %@", [err domain], [err code], [[err userInfo] description]);
        UIAlertView *alert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: [err localizedDescription]
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [alert show];
        [alert release];
        return;
    }
    //prepare to record
    [_recorder setDelegate:self];
    [_recorder prepareToRecord];
    _recorder.meteringEnabled = YES;
    BOOL audioHWAvailable = audioSession.inputIsAvailable;
    if (! audioHWAvailable) {
        UIAlertView *cantRecordAlert =
        [[UIAlertView alloc] initWithTitle: @"Warning"
                                   message: @"Audio input hardware not available"
                                  delegate: nil
                         cancelButtonTitle:@"OK"
                         otherButtonTitles:nil];
        [cantRecordAlert show];
        [cantRecordAlert release];
        return;
    }
}

//Delegate callback: fires when recording finishes successfully
- (void)audioRecorderDidFinishRecording:(AVAudioRecorder *) aRecorder successfully:(BOOL)flag
{
//    NSLog(@"recorder successfully");
//    UIAlertView *recorderSuccessful = [[UIAlertView alloc] initWithTitle:@"" message:@"录音成功"
//                                                                delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
//    [recorderSuccessful show];
//    [recorderSuccessful release];
}

//Delegate callback: fires when an encoding error occurs during recording
- (void)audioRecorderEncodeErrorDidOccur:(AVAudioRecorder *)arecorder error:(NSError *)error
{

//    UIAlertView *recorderFailed = [[UIAlertView alloc] initWithTitle:@"" message:@"发生错误"
//                                                            delegate:self cancelButtonTitle:@"OK" otherButtonTitles:nil];
//    [recorderFailed show];
//    [recorderFailed release];
}

-(void)CompileFilesToMakeMovie
{
    //This method merges the video built from the still images with the recorded audio into a new video in the sandbox
    AVMutableComposition* mixComposition = [AVMutableComposition composition];

    NSString* audio_inputFileName = @"sound.caf";
    NSString* audio_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], audio_inputFileName] ;
    NSURL*    audio_inputFileUrl = [NSURL fileURLWithPath:audio_inputFilePath];

    NSString* video_inputFileName = @"veido.mp4";
    NSString* video_inputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], video_inputFileName] ;
    NSURL*    video_inputFileUrl = [NSURL fileURLWithPath:video_inputFilePath];

    NSString* outputFileName = @"outputVeido.mov";
    NSString* outputFilePath = [NSString stringWithFormat:@"%@/%@", [NSHomeDirectory() stringByAppendingPathComponent:@"Documents"], outputFileName] ;
    NSURL*    outputFileUrl = [NSURL fileURLWithPath:outputFilePath];

    if ([[NSFileManager defaultManager] fileExistsAtPath:outputFilePath])
        [[NSFileManager defaultManager] removeItemAtPath:outputFilePath error:nil];

    CMTime nextClipStartTime = kCMTimeZero;

    AVURLAsset* videoAsset = [[AVURLAsset alloc]initWithURL:video_inputFileUrl options:nil];
    CMTimeRange video_timeRange = CMTimeRangeMake(kCMTimeZero,videoAsset.duration);
    AVMutableCompositionTrack *a_compositionVideoTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeVideo preferredTrackID:kCMPersistentTrackID_Invalid];
    [a_compositionVideoTrack insertTimeRange:video_timeRange ofTrack:[[videoAsset tracksWithMediaType:AVMediaTypeVideo] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVURLAsset* audioAsset = [[AVURLAsset alloc]initWithURL:audio_inputFileUrl options:nil];
    CMTimeRange audio_timeRange = CMTimeRangeMake(kCMTimeZero, audioAsset.duration);
    AVMutableCompositionTrack *b_compositionAudioTrack = [mixComposition addMutableTrackWithMediaType:AVMediaTypeAudio preferredTrackID:kCMPersistentTrackID_Invalid];
    [b_compositionAudioTrack insertTimeRange:audio_timeRange ofTrack:[[audioAsset tracksWithMediaType:AVMediaTypeAudio] objectAtIndex:0] atTime:nextClipStartTime error:nil];

    AVAssetExportSession* _assetExport = [[AVAssetExportSession alloc] initWithAsset:mixComposition presetName:AVAssetExportPresetHighestQuality];
    _assetExport.outputFileType = @"com.apple.quicktime-movie";
    _assetExport.outputURL = outputFileUrl;

    [_assetExport exportAsynchronouslyWithCompletionHandler:
     ^(void ) {
     }
     ];

}

@end
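
One more note: the completion handler passed to exportAsynchronouslyWithCompletionHandler: above is left empty. If you want to know whether the merge actually succeeded, the handler could check the export status, for example:

[_assetExport exportAsynchronouslyWithCompletionHandler:^(void) {
    if (_assetExport.status == AVAssetExportSessionStatusCompleted) {
        //the merged file was written to outputVeido.mov in the Documents folder
        NSLog(@"export completed");
    } else {
        //AVAssetExportSessionStatusFailed or AVAssetExportSessionStatusCancelled ends up here
        NSLog(@"export failed: %@", _assetExport.error);
    }
}];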


As the screenshot below shows, the buttons belong to the UI layer, while the cube behind them is rendered by U3D and keeps rotating. When the 开始截屏 (start capture) button is tapped, the Objective-C side and U3D each take a screenshot on every timer tick (every 0.1 s), and audio recording starts at the same time. When the 结束截屏 (end capture) button is tapped, the program first merges each pair of Objective-C and U3D screenshots into a new image, then builds a silent video from those images, and finally combines the silent video with the recorded audio into a new video saved in the sandbox.


Now take a look at the sandbox folder in the simulator: "<number>.JPG" are the images captured on the Objective-C side, "<number>u3d.JPG" are the images captured inside U3D, sound.caf is the recorded audio, veido.mp4 is the silent video assembled from the image sequence, and outputVeido.mov is the final video produced by combining the silent video with the audio.


Double-click the outputVeido.mov file and it plays straight away in QuickTime Player. Pretty neat, right? Haha. So U3D can handle screen recording after all~~
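
If you would rather preview the merged file on the device itself, MediaPlayer.framework is already linked, so something along these lines could be used (a rough sketch, assuming the outputVeido.mov file name from above):

NSString *moviePath = [[NSHomeDirectory() stringByAppendingPathComponent:@"Documents"]
                       stringByAppendingPathComponent:@"outputVeido.mov"];
MPMoviePlayerViewController *player = [[MPMoviePlayerViewController alloc]
                                       initWithContentURL:[NSURL fileURLWithPath:moviePath]];
//presentMoviePlayerViewControllerAnimated: is the UIViewController category that MediaPlayer.framework provides
[self presentMoviePlayerViewControllerAnimated:player];
[player release];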


Finally, note that this code also works in an ordinary iOS project; the U3D case only adds the extra step of merging the two screenshots. All of the code lives in MyViewController, so please read it carefully. Over the next couple of days MOMO will also find time to look into screen recording on Android, so stay tuned. Because the Xcode project generated by U3D is quite large, I will not upload it; instead I am providing a download link for the pure Objective-C code. Finally, 雨松MOMO wishes everyone happy studying.