Export Wav files using AVAssetExportSession

I am trying to add attenuation to a WAV file and then export a new file with the attenuation applied, using AVAssetExportSession. All the examples I have seen export as M4A. Is it possible to do this with WAV or AIFF?

The error I am getting is:

 AVAssetExportSessionStatusFailed Error Domain=AVFoundationErrorDomain Code=-11822 "Cannot Open" UserInfo=0x1f01c9f0 {NSLocalizedDescription=Cannot Open, NSLocalizedFailureReason=This media format is not supported.} 

My code is below:

 NSString *inpath = [path stringByAppendingFormat:@"/%@", file];
 NSString *ename = [file stringByDeletingPathExtension];
 NSString *incname = [ename stringByAppendingString:@"1t"];
 NSString *outname = [incname stringByAppendingPathExtension:@"wav"];
 NSString *outpath = [path stringByAppendingFormat:@"/%@", outname];
 NSURL *urlpath = [NSURL fileURLWithPath:inpath];
 NSURL *urlout = [NSURL fileURLWithPath:outpath];

 NSDictionary *options = @{AVURLAssetPreferPreciseDurationAndTimingKey: @YES};
 AVURLAsset *anAsset = [[AVURLAsset alloc] initWithURL:urlpath options:options];

 // check the sound file is greater than 50 seconds
 CMTime assetTime = [anAsset duration];
 Float64 duration = CMTimeGetSeconds(assetTime);
 if (duration < 50.0) return NO;

 // get the first audio track
 NSArray *tracks = [anAsset tracksWithMediaType:AVMediaTypeAudio];
 if ([tracks count] == 0) return NO;
 AVAssetTrack *track = [tracks objectAtIndex:0];

 // create trim time range - 20 seconds starting from 30 seconds into the asset
 CMTime startTime = CMTimeMake(30, 1);
 CMTime stopTime = CMTimeMake(50, 1);
 CMTimeRange exportTimeRange = CMTimeRangeFromTimeToTime(startTime, stopTime);

 // create fade-in time range - 10 seconds starting at the beginning of the trimmed asset
 CMTime startFadeInTime = startTime;
 CMTime endFadeInTime = CMTimeMake(40, 1);
 CMTimeRange fadeInTimeRange = CMTimeRangeFromTimeToTime(startFadeInTime, endFadeInTime);

 // set up the audio mix
 AVMutableAudioMix *exportAudioMix = [AVMutableAudioMix audioMix];
 AVMutableAudioMixInputParameters *exportAudioMixInputParameters =
     [AVMutableAudioMixInputParameters audioMixInputParametersWithTrack:track];
 [exportAudioMixInputParameters setVolumeRampFromStartVolume:0.0
                                                 toEndVolume:1.0
                                                   timeRange:fadeInTimeRange];
 exportAudioMix.inputParameters = [NSArray arrayWithObject:exportAudioMixInputParameters];

 AVAssetExportSession *exportSession =
     [AVAssetExportSession exportSessionWithAsset:anAsset
                                       presetName:AVAssetExportPresetPassthrough];
 //NSArray *listof = [AVAssetExportSession exportPresetsCompatibleWithAsset:anAsset];
 //NSLog(@"LISTOF %@", listof);

 // choose an output file type matching the track's format
 id desc = [track.formatDescriptions objectAtIndex:0];
 const AudioStreamBasicDescription *audioDesc =
     CMAudioFormatDescriptionGetStreamBasicDescription((CMAudioFormatDescriptionRef)desc);
 FourCharCode formatID = audioDesc->mFormatID;
 NSString *fileType = nil;
 NSString *ex = nil;
 switch (formatID) {
     case kAudioFormatLinearPCM: {
         UInt32 flags = audioDesc->mFormatFlags;
         if (flags & kAudioFormatFlagIsBigEndian) {
             fileType = @"public.aiff-audio";
             ex = @"aif";
         } else {
             fileType = @"com.microsoft.waveform-audio";
             ex = @"wav";
         }
         break;
     }
     case kAudioFormatMPEGLayer3:
         fileType = @"com.apple.quicktime-movie";
         ex = @"mp3";
         break;
     case kAudioFormatMPEG4AAC:
     case kAudioFormatAppleLossless:
         fileType = @"com.apple.m4a-audio";
         ex = @"m4a";
         break;
     default:
         break;
 }

 exportSession.outputFileType = fileType;
 exportSession.outputURL = urlout;
 //exportSession.outputFileType = AVFileTypeWAVE;
 exportSession.timeRange = exportTimeRange; // trim time range
 exportSession.audioMix = exportAudioMix;   // fade-in audio mix

 // perform the export
 [exportSession exportAsynchronouslyWithCompletionHandler:^{
     if (AVAssetExportSessionStatusCompleted == exportSession.status) {
         NSLog(@"AVAssetExportSessionStatusCompleted");
     } else if (AVAssetExportSessionStatusFailed == exportSession.status) {
         // a failure may happen because of an event out of your control,
         // for example an interruption such as an incoming phone call;
         // make sure to handle this case appropriately
         NSLog(@"AVAssetExportSessionStatusFailed %@", exportSession.error);
     } else {
         NSLog(@"Export Session Status: %d", (int)exportSession.status);
     }
 }];
 return YES;
 }
Answer:

You cannot do this with AVAssetExportSession because its presets are fixed and cannot be customized. AVAssetExportPresetPassthrough keeps the input format unchanged in the output, and the remaining presets only produce the compressed container formats Apple supports (such as M4A); none of them can write WAV or AIFF.
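You can confirm this at runtime by asking the session what it can actually produce. A minimal sketch using the existing `exportPresetsCompatibleWithAsset:` and `supportedFileTypes` APIs (the `inputURL` variable is an assumption standing in for your source file):

```objectivec
// Sketch: inspect what AVAssetExportSession can produce for a given asset.
// Note that no WAV/AIFF file type ever appears in the results.
#import <AVFoundation/AVFoundation.h>

AVURLAsset *asset = [AVURLAsset URLAssetWithURL:inputURL options:nil];

// Which presets can be used with this asset at all?
NSArray *presets = [AVAssetExportSession exportPresetsCompatibleWithAsset:asset];
NSLog(@"Compatible presets: %@", presets);

// Which container file types can a given preset write?
AVAssetExportSession *session =
    [AVAssetExportSession exportSessionWithAsset:asset
                                      presetName:AVAssetExportPresetAppleM4A];
NSLog(@"Supported output types: %@", session.supportedFileTypes);
```

For the M4A preset this typically lists only `com.apple.m4a-audio`, which is why setting `outputFileType` to a WAV UTI fails with "This media format is not supported."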

Since your task requires manipulating the audio data buffers directly, you should use the other option AVFoundation gives you: a paired AVAssetReader and AVAssetWriter configuration. You can find a suitable code example in Apple's AVReaderWriterOSX developer sample. The approach also works on iOS, and unlike the export session it lets you specify different input and output format settings, so you can decompress the audio to PCM and write it back out as an uncompressed .wav file.
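The reader/writer pipeline can be sketched as follows. This is a simplified outline, not Apple's sample verbatim: the function name `exportAsWav` is hypothetical, error handling is abbreviated, and depending on your source you may need to add sample-rate/channel keys to the reader settings. All the classes, methods, and settings keys used are standard AVFoundation API.

```objectivec
// Sketch: decode an asset's audio to linear PCM with AVAssetReader and
// write it into a WAV container with AVAssetWriter.
#import <AVFoundation/AVFoundation.h>

static BOOL exportAsWav(AVURLAsset *asset, NSURL *outURL, NSError **error)
{
    AVAssetTrack *track = [[asset tracksWithMediaType:AVMediaTypeAudio] firstObject];
    if (!track) return NO;

    AVAssetReader *reader = [AVAssetReader assetReaderWithAsset:asset error:error];
    // Ask the reader to decode to 16-bit little-endian interleaved integer PCM.
    NSDictionary *pcmSettings = @{
        AVFormatIDKey: @(kAudioFormatLinearPCM),
        AVLinearPCMBitDepthKey: @16,
        AVLinearPCMIsFloatKey: @NO,
        AVLinearPCMIsBigEndianKey: @NO,
        AVLinearPCMIsNonInterleaved: @NO,
    };
    AVAssetReaderTrackOutput *output =
        [AVAssetReaderTrackOutput assetReaderTrackOutputWithTrack:track
                                                   outputSettings:pcmSettings];
    [reader addOutput:output];

    AVAssetWriter *writer = [AVAssetWriter assetWriterWithURL:outURL
                                                     fileType:AVFileTypeWAVE
                                                        error:error];
    // nil settings = append the (already uncompressed) PCM buffers unchanged.
    AVAssetWriterInput *input =
        [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                           outputSettings:nil];
    [writer addInput:input];

    [reader startReading];
    [writer startWriting];
    [writer startSessionAtSourceTime:kCMTimeZero];

    // Pull decoded buffers from the reader and push them to the writer.
    dispatch_queue_t queue = dispatch_queue_create("wav.export", NULL);
    dispatch_semaphore_t done = dispatch_semaphore_create(0);
    [input requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        while (input.isReadyForMoreMediaData) {
            CMSampleBufferRef buffer = [output copyNextSampleBuffer];
            if (!buffer) {
                [input markAsFinished];
                [writer finishWritingWithCompletionHandler:^{
                    dispatch_semaphore_signal(done);
                }];
                break;
            }
            [input appendSampleBuffer:buffer];
            CFRelease(buffer);
        }
    }];
    dispatch_semaphore_wait(done, DISPATCH_TIME_FOREVER);
    return writer.status == AVAssetWriterStatusCompleted;
}
```

To reproduce the fade-in from your original code, apply an `audioMix` to the AVAssetReaderAudioMixOutput variant of the reader output (or scale the PCM samples yourself as you copy the buffers); the trim can be implemented by setting the reader's `timeRange` property before `startReading`.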
