How to record a perfect loop in iOS / Xcode

user

I've been wrestling with this for a while now, and I'm posting both to solve my own problem and on behalf of others searching for the same thing.

I'm writing an app that records loops, much like GarageBand. That is, I want to record 8 beats for the user and then let them play the recording back as a loop. While they record, I play a metronome (the user wears headphones to hear the metronome and records into the microphone on their device).

I can manage to turn the recorder on for roughly 4.8 seconds (0.6 s × 8 beats), and my timer reports that it ran for 4.8 seconds, but the resulting recording is always shorter than 4.8 s, something like 4.78 or 4.71, which breaks the loop.

I've experimented with AVAudioRecorder, AudioQueue, and Audio Units, thinking the last of those might eventually solve my problem.

I'm using an NSTimer that fires every 0.6 seconds to trigger a short metronome blip. After 4 beats, the metronome function turns the recorder on, waits 4.8 seconds, and then stops the recording.

I use time intervals to measure how long the metronome ran (it looks tight, around 4.800xxx) and compare that with the duration of the audio file, which is always different.

I wish I could attach my whole project, but the header and implementation below should be enough. To test it, you need to make a project with the following IB elements:

- Record, Play, and Stop buttons
- a song/track duration label
- a timer duration label
- a debug label

If you start the app and hit Record, you get a 4-beat count-in, and then the recorder starts. Tap your finger on the desk along with it; after 8 more beats (12 in total), the recorder stops.

You can see in the display that the recorded track is a little shorter than 4.8 seconds, and in some cases much shorter, which keeps the audio from looping cleanly.

Does anyone know what I can do to tighten this up? Thanks for reading.

Here is my code:

//
//  ViewController.h
//  speakagain
//
//  Created by NOTHING on 2014-03-18.
//

#import <UIKit/UIKit.h>
#import <Foundation/Foundation.h>
#import "CoreAudio/CoreAudioTypes.h"
#import <AudioToolbox/AudioQueue.h>
#import <AudioToolbox/AudioFile.h>
#import <AVFoundation/AVFoundation.h>

// Number of audio queue buffers. The original post never defines this constant,
// so the code as posted does not compile; 3 is the value used in Apple's
// Audio Queue Services recording example.
#define kNumberBuffers 3

@interface ViewController : UIViewController
{
    IBOutlet UIButton *btnRecord;
    IBOutlet UIButton *btnPlay;
    IBOutlet UIButton *btnStop;
    IBOutlet UILabel *debugLabel;
    IBOutlet UILabel *timerDuration;
    IBOutlet UILabel *songDuration;

    //UILabel *labelDebug;

    struct AQRecorderState {
        AudioStreamBasicDescription  mDataFormat;
        AudioQueueRef                mQueue;
        AudioQueueBufferRef          mBuffers[kNumberBuffers];
        AudioFileID                  mAudioFile;
        UInt32                       bufferByteSize;
        SInt64                       mCurrentPacket;
        bool                         mIsRunning;

    };
    struct AQRecorderState aqData;
    AVAudioPlayer *audioPlayer;

    NSString *songName;
    NSTimer *recordTimer;
    NSTimer *metroTimer;
    NSTimeInterval startTime, endTime, elapsedTime;

    int inputBuffer;
    int beatNumber;

}
@property (nonatomic, retain)   IBOutlet UIButton *btnRecord;
@property (nonatomic, retain)   IBOutlet UIButton *btnPlay;
@property (nonatomic, retain)   IBOutlet UIButton *btnStop;
@property (nonatomic, retain)   IBOutlet UILabel *debugLabel;
@property (nonatomic, retain) IBOutlet UILabel *timerDuration;
@property (nonatomic, retain) IBOutlet UILabel *songDuration;


- (IBAction) record;
- (IBAction) stop;
- (IBAction) play;

static void HandleInputBuffer (void *aqData,AudioQueueRef inAQ,AudioQueueBufferRef inBuffer,const AudioTimeStamp *inStartTime, UInt32 inNumPackets,const AudioStreamPacketDescription  *inPacketDesc);

@end

The implementation:

//
    //  ViewController.m
    //  speakagain
    //
    //  Created by NOTHING on 2014-03-18.
    //

    #import "ViewController.h"

    // Forward declaration: prepareAudioQueue calls this helper before its definition below.
    void DeriveBufferSize(AudioQueueRef audioQueue, AudioStreamBasicDescription ASBDescription, Float64 seconds, UInt32 *outBufferSize);


    @interface ViewController ()

    @end

    @implementation ViewController
    @synthesize btnPlay, btnRecord,btnStop,songDuration, timerDuration, debugLabel;


    - (void)viewDidLoad
    {
        debugLabel.text = @"";
        songName = @"TestingQueue.caf";



        [super viewDidLoad];
        // Do any additional setup after loading the view, typically from a nib.
    }
    - (void)prepareAudioQueue
    {
        //struct AQRecorderState *pAqData;
        inputBuffer=0;
        aqData.mDataFormat.mFormatID         = kAudioFormatLinearPCM;
        aqData.mDataFormat.mSampleRate       = 44100.0;
        aqData.mDataFormat.mChannelsPerFrame = 1;
        aqData.mDataFormat.mBitsPerChannel   = 16;
        aqData.mDataFormat.mBytesPerPacket   =
        aqData.mDataFormat.mBytesPerFrame = aqData.mDataFormat.mChannelsPerFrame * sizeof (SInt16);
        aqData.mDataFormat.mFramesPerPacket  = 1;

        //    AudioFileTypeID fileType             = kAudioFileAIFFType;
        AudioFileTypeID fileType             = kAudioFileCAFType;
        aqData.mDataFormat.mFormatFlags = kLinearPCMFormatFlagIsBigEndian| kLinearPCMFormatFlagIsSignedInteger| kLinearPCMFormatFlagIsPacked;

        AudioQueueNewInput (&aqData.mDataFormat,HandleInputBuffer, &aqData,NULL, kCFRunLoopCommonModes, 0,&aqData.mQueue);

        UInt32 dataFormatSize = sizeof (aqData.mDataFormat);

        // in Mac OS X, instead use
        //    kAudioConverterCurrentInputStreamDescription
        AudioQueueGetProperty (aqData.mQueue,kAudioQueueProperty_StreamDescription,&aqData.mDataFormat,&dataFormatSize);

        //Verify
        NSFileManager *fileManager = [NSFileManager defaultManager];
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];

        NSLog(@"INITIALIZING FILE");
        if ([fileManager fileExistsAtPath:txtPath] == YES) {
            NSLog(@"PREVIOUS FILE REMOVED");
            [fileManager removeItemAtPath:txtPath error:nil];
        }


        const char *filePath = [txtPath UTF8String];
        CFURLRef audioFileURL = CFURLCreateFromFileSystemRepresentation ( NULL,(const UInt8 *) filePath,strlen (filePath),false );
        AudioFileCreateWithURL (audioFileURL,fileType,&aqData.mDataFormat, kAudioFileFlags_EraseFile,&aqData.mAudioFile );

        DeriveBufferSize (aqData.mQueue,aqData.mDataFormat,0.5,&aqData.bufferByteSize);

        for (int i = 0; i < kNumberBuffers; ++i)
        {
            AudioQueueAllocateBuffer (aqData.mQueue,aqData.bufferByteSize,&aqData.mBuffers[i]);
            AudioQueueEnqueueBuffer (aqData.mQueue,aqData.mBuffers[i], 0,NULL );
        }

    }

    - (void) metronomeFire
    {
        if(beatNumber < 5)
        {
            //count in time.
            // just play the metro beep but don't start recording
            debugLabel.text = @"count in (1,2,3,4)";
            [self playSound];
        }
        if(beatNumber == 5)
        {
            //start recording
            aqData.mCurrentPacket = 0;
            aqData.mIsRunning = true;
            startTime = [NSDate timeIntervalSinceReferenceDate];
            recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
            AudioQueueStart (aqData.mQueue,NULL);
            debugLabel.text = @"Recording for 8 beats (1,2,3,4 1,2,3,4)";
            [self playSound];
        }
        else if (beatNumber < 12)
        {   //play the metronome on beats 6-11
            [self playSound];
        }
        if(beatNumber == 12)
        {
            [metroTimer invalidate]; metroTimer = nil;
            [self playSound];
        }

        beatNumber++;

    }
    - (IBAction) play
    {
        NSArray *paths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
        NSString *documentsDirectory = [paths objectAtIndex:0];
        NSString *txtPath = [documentsDirectory stringByAppendingPathComponent:songName];
        NSURL *url = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@",txtPath]];

        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;
        }
        NSError *error;
        audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:&error];

        if (audioPlayer == nil)
        {
            NSLog(@"%@",[error description]);
        }
        else
        {
            [audioPlayer play];
            [audioPlayer setNumberOfLoops:-1];
        }
    }
    - (void) killTimer
    {
        //this is the timer function.  Runs once after 4.8 seconds.
       [self stop];

    }
    - (IBAction) stop
    {
        if (audioPlayer)
        {
            [audioPlayer stop];
            audioPlayer = nil;



        }
        else
        {

            if(metroTimer)
            {
                [metroTimer invalidate];metroTimer = nil;
            }
            //Stop the audio queue
            AudioQueueStop (aqData.mQueue,true);
            aqData.mIsRunning = false;
            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);

            //Get elapsed time of timer
            endTime = [NSDate timeIntervalSinceReferenceDate];
            elapsedTime = endTime - startTime;

            //Get elapsed time of audio file
            NSArray *pathComponents = [NSArray arrayWithObjects:
                                       [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES) lastObject],
                                       songName,
                                       nil];
            NSURL *audioFileURL = [NSURL fileURLWithPathComponents:pathComponents];
            AVURLAsset* audioAsset = [AVURLAsset URLAssetWithURL:audioFileURL options:nil];
            CMTime audioDuration = audioAsset.duration;
            float audioDurationSeconds = CMTimeGetSeconds(audioDuration);

            //Log values
            NSLog(@"Track Duration: %f",audioDurationSeconds);
            NSLog(@"Timer Duration: %.6f", elapsedTime);

            //Show values on GUI too
            songDuration.text = [NSString stringWithFormat: @"Track Duration: %f",audioDurationSeconds];
            timerDuration.text = [NSString stringWithFormat:@"Timer Duration: %@",[NSString stringWithFormat: @"%.6f", elapsedTime]];
            debugLabel.text = @"Why is the duration of the track less than the duration the timer ran?";
        }


    }
    -(void) playSound
    {
        NSString *path = [[NSBundle mainBundle] pathForResource:@"blip2" ofType:@"aif"];
        SystemSoundID soundID;
        AudioServicesCreateSystemSoundID((__bridge CFURLRef)[NSURL fileURLWithPath:path],  &soundID);
        AudioServicesPlaySystemSound (soundID);
    }

    - (IBAction) record
    {
        [self prepareAudioQueue];
        songDuration.text = @"";
        timerDuration.text = @"";
        //debugLabel.text = @"Please wait 12 beats (The first four are count in)";
        //init beat number
        beatNumber = 1;

        //safe guard
        if(aqData.mIsRunning)
        {
            AudioQueueStop (aqData.mQueue,true);

            aqData.mIsRunning = false;

            AudioQueueDispose (aqData.mQueue,true);
            AudioFileClose (aqData.mAudioFile);
        }

        //start count in (metro will start recording)
        //aqData.mCurrentPacket = 0;
        //aqData.mIsRunning = true;
        startTime = [NSDate timeIntervalSinceReferenceDate];
        metroTimer = [NSTimer scheduledTimerWithTimeInterval:.6 target:self selector:@selector(metronomeFire) userInfo:nil repeats:YES];
        //recordTimer = [NSTimer scheduledTimerWithTimeInterval:4.8 target:self selector:@selector(killTimer) userInfo:nil repeats:NO];
        //AudioQueueStart (aqData.mQueue,NULL);

    }
    static void HandleInputBuffer (void *aqData,AudioQueueRef inAQ,AudioQueueBufferRef inBuffer,const AudioTimeStamp *inStartTime,UInt32 inNumPackets,const AudioStreamPacketDescription *inPacketDesc)
    {
        //boiler plate
        NSLog(@"HandleInputBuffer");

        struct AQRecorderState *pAqData = (struct AQRecorderState *) aqData;

        if (inNumPackets == 0 && pAqData->mDataFormat.mBytesPerPacket != 0)
            inNumPackets = inBuffer->mAudioDataByteSize / pAqData->mDataFormat.mBytesPerPacket;

        if (AudioFileWritePackets (pAqData->mAudioFile,false,inBuffer->mAudioDataByteSize,inPacketDesc,pAqData->mCurrentPacket,&inNumPackets,inBuffer->mAudioData) == noErr)
        {
            pAqData->mCurrentPacket += inNumPackets;
        }

        if (pAqData->mIsRunning == 0)
            return;

        AudioQueueEnqueueBuffer (pAqData->mQueue,inBuffer,0,NULL);
    }

    void DeriveBufferSize(AudioQueueRef audioQueue,AudioStreamBasicDescription ASBDescription,Float64 seconds,UInt32 *outBufferSize)
    {
        //boiler plate
        static const int maxBufferSize = 0x50000;
        int maxPacketSize = ASBDescription.mBytesPerPacket;
        if(maxPacketSize == 0)
        {
            UInt32 maxVBRPacketSize = sizeof(maxPacketSize);
            AudioQueueGetProperty(audioQueue, kAudioQueueProperty_MaximumOutputPacketSize, &maxPacketSize, &maxVBRPacketSize);
            NSLog(@"max packet size = %d",maxPacketSize);
        }
        Float64 numBytesForTime = ASBDescription.mSampleRate * maxPacketSize * seconds;
        *outBufferSize = (UInt32)(numBytesForTime < maxBufferSize ? numBytesForTime : maxBufferSize);
    }

    OSStatus SetMagicCookieForFile (AudioQueueRef inQueue, AudioFileID inFile)
    {
        //boiler plate
        OSStatus result = noErr;
        UInt32 cookieSize;
        if (AudioQueueGetPropertySize (inQueue,kAudioQueueProperty_MagicCookie,&cookieSize) == noErr)
        {
            char* magicCookie =(char *) malloc (cookieSize);
            if (AudioQueueGetProperty (inQueue,kAudioQueueProperty_MagicCookie,magicCookie,&cookieSize) == noErr)
            {
                result =    AudioFileSetProperty (inFile,kAudioFilePropertyMagicCookieData,cookieSize,magicCookie);
            }

            free (magicCookie);
        }
        return result;

    }













    - (void)didReceiveMemoryWarning
    {
        [super didReceiveMemoryWarning];
        // Dispose of any resources that can be recreated.
    }
    @end
Max MacLeod

This is a big topic, so I doubt you'll get an answer large enough to re-architect the code you've provided. However, I can give you links that cover the vast majority of what you need.

First off, NSTimer will never work here because of sync issues. Also, forget AudioQueue and AVAudioRecorder; only an Audio Unit is low-level enough for what you need.
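The practical upshot of moving to an Audio Unit: you stop recording after an exact number of frames counted inside the input callback, rather than when a timer fires. A rough sketch of that stop condition, with the Core Audio callback simulated and all names purely illustrative:

```c
#include <stdbool.h>
#include <stddef.h>

/* Sketch of a sample-accurate stop condition for an audio input callback.
   In a real RemoteIO render callback, inNumberFrames arrives from Core Audio;
   here the struct and names are illustrative, not an actual API. */
typedef struct {
    size_t framesWanted;     /* e.g. 8 beats * 0.6 s * 44100 Hz = 211680 */
    size_t framesCaptured;
    bool   done;
} LoopRecorder;

/* Returns how many of the incoming frames belong in the loop;
   the remainder of the buffer would simply be discarded. */
static size_t acceptFrames(LoopRecorder *r, size_t inNumberFrames) {
    size_t remaining = r->framesWanted - r->framesCaptured;
    size_t take = inNumberFrames < remaining ? inNumberFrames : remaining;
    r->framesCaptured += take;
    if (r->framesCaptured == r->framesWanted)
        r->done = true;      /* the loop is exactly framesWanted frames long */
    return take;
}
```

Because the count is kept in frames, the loop length comes out exact regardless of the hardware buffer size or of when any timer happens to fire.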

Have a look at my answer here:

ios stream audio from one ios device to another

The real gold mine, though, and the material you'll need to become very familiar with, is the Tasty Pixel blog. Tasty Pixel is the vendor of Loopy HD, and its author is kind enough to share some very in-depth knowledge.

See:

A simple, fast circular buffer implementation for audio processing

Developing Loopy, Part 2: Implementation

Using the RemoteIO Audio Unit

Finally, make sure you're familiar with packets, frames, samples, and so on. Everything needs to sync up perfectly.
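For orientation, the circular-buffer idea from the first link can be sketched as below. This illustrates only the concept, not the API of the TPCircularBuffer library the article actually describes:

```c
#include <stddef.h>
#include <stdint.h>

/* Minimal single-producer/single-consumer ring buffer for 16-bit PCM samples.
   head and tail are free-running counters; a power-of-two capacity keeps the
   modulo arithmetic correct even if the counters eventually wrap. */
typedef struct {
    int16_t data[8192];
    size_t  head;   /* total samples ever written */
    size_t  tail;   /* total samples ever read */
} RingBuffer;

static size_t rb_write(RingBuffer *rb, const int16_t *src, size_t n) {
    const size_t cap = sizeof rb->data / sizeof rb->data[0];
    size_t space = cap - (rb->head - rb->tail);
    if (n > space) n = space;                 /* drop samples that don't fit */
    for (size_t i = 0; i < n; i++)
        rb->data[(rb->head + i) % cap] = src[i];
    rb->head += n;
    return n;
}

static size_t rb_read(RingBuffer *rb, int16_t *dst, size_t n) {
    const size_t cap = sizeof rb->data / sizeof rb->data[0];
    size_t avail = rb->head - rb->tail;
    if (n > avail) n = avail;                 /* read only what's available */
    for (size_t i = 0; i < n; i++)
        dst[i] = rb->data[(rb->tail + i) % cap];
    rb->tail += n;
    return n;
}
```

In an audio app, the input callback would call `rb_write` with each incoming buffer and the playback side would `rb_read` exactly the loop-length worth of samples, which is what makes the loop boundary sample-accurate.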
