Passing sound (wav) from Objective-C to JavaScript

I am recording a sound file (wav format) in Objective-C. I want to pass this back to JavaScript using Objective-C's stringByEvaluatingJavaScriptFromString. I think I will have to convert the wav file to a base64 string in order to pass it to this function. Then I will need to convert the base64 string back to a (wav/blob) format in JavaScript in order to pass it into an audio tag and play it. I don't know how to do this. I am also not sure whether this is the best way to transfer the wav file back to JavaScript. Any ideas would be appreciated.
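To make it concrete, what I have in mind on the JavaScript side is roughly the sketch below. The function name playBase64Wav and the audio element id are just placeholders of mine, and I have not verified that this works inside a UIWebView:

function playBase64Wav(base64String) {
    // Decode the base64 string (passed in from Objective-C) into raw bytes.
    var byteString = atob(base64String);
    var bytes = new Uint8Array(byteString.length);
    for (var i = 0; i < byteString.length; i++) {
        bytes[i] = byteString.charCodeAt(i);
    }

    // Wrap the bytes in a wav blob and hand its object URL to an audio tag.
    var blob = new Blob([bytes], { type: 'audio/wav' });
    var url = (window.URL || window.webkitURL).createObjectURL(blob);
    var audio = document.getElementById('player'); // assumes an <audio id="player"> element exists
    audio.src = url;
    audio.play();
}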

+1
javascript objective-c ios6 audio
2 answers

Well, this was not as straightforward as I expected, so this is how I was able to achieve it.

Step 1: I recorded the audio in caf format using AVAudioRecorder.

NSArray *dirPaths;
NSString *docsDir;

dirPaths = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
docsDir = [dirPaths objectAtIndex:0];
soundFilePath = [docsDir stringByAppendingPathComponent:@"sound.caf"];

NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];

NSDictionary *recordSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                [NSNumber numberWithInt:AVAudioQualityMin], AVEncoderAudioQualityKey,
                                [NSNumber numberWithInt:16], AVEncoderBitRateKey,
                                [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                                [NSNumber numberWithFloat:44100], AVSampleRateKey,
                                nil];

NSError *error = nil;
audioRecorder = [[AVAudioRecorder alloc] initWithURL:soundFileURL
                                            settings:recordSettings
                                               error:&error];
if (error) {
    NSLog(@"error: %@", [error localizedDescription]);
} else {
    [audioRecorder prepareToRecord];
}

After that you just need to call [audioRecorder record] to record the sound. It will be recorded in caf format. If you want to see my recordAudio function, here it is.

-(void) recordAudio {
    if (!audioRecorder.recording) {
        _playButton.enabled = NO;
        _recordButton.title = @"Stop";
        [audioRecorder record];
        [self animate1:nil finished:nil context:nil];
    } else {
        [_recordingImage stopAnimating];
        [audioRecorder stop];
        _playButton.enabled = YES;
        _recordButton.title = @"Record";
    }
}

Step 2: Convert the caf file to wav format. I was able to accomplish this using the following function.

-(BOOL)exportAssetAsWaveFormat:(NSString*)filePath
{
    NSError *error = nil;

    NSDictionary *audioSetting = [NSDictionary dictionaryWithObjectsAndKeys:
                                  [NSNumber numberWithFloat:44100.0], AVSampleRateKey,
                                  [NSNumber numberWithInt:2], AVNumberOfChannelsKey,
                                  [NSNumber numberWithInt:16], AVLinearPCMBitDepthKey,
                                  [NSNumber numberWithInt:kAudioFormatLinearPCM], AVFormatIDKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsFloatKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsBigEndianKey,
                                  [NSNumber numberWithBool:NO], AVLinearPCMIsNonInterleaved,
                                  [NSData data], AVChannelLayoutKey,
                                  nil];

    NSString *audioFilePath = filePath;
    AVURLAsset *URLAsset = [[AVURLAsset alloc] initWithURL:[NSURL fileURLWithPath:audioFilePath] options:nil];
    if (!URLAsset) return NO;

    AVAssetReader *assetReader = [AVAssetReader assetReaderWithAsset:URLAsset error:&error];
    if (error) return NO;

    NSArray *tracks = [URLAsset tracksWithMediaType:AVMediaTypeAudio];
    if (![tracks count]) return NO;

    AVAssetReaderAudioMixOutput *audioMixOutput = [AVAssetReaderAudioMixOutput
                                                   assetReaderAudioMixOutputWithAudioTracks:tracks
                                                   audioSettings:audioSetting];
    if (![assetReader canAddOutput:audioMixOutput]) return NO;
    [assetReader addOutput:audioMixOutput];
    if (![assetReader startReading]) return NO;

    NSString *title = @"WavConverted";
    NSArray *docDirs = NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES);
    NSString *docDir = [docDirs objectAtIndex:0];
    NSString *outPath = [[docDir stringByAppendingPathComponent:title] stringByAppendingPathExtension:@"wav"];
    if (![[NSFileManager defaultManager] removeItemAtPath:outPath error:NULL]) {
        return NO;
    }
    soundFilePath = outPath;

    NSURL *outURL = [NSURL fileURLWithPath:outPath];
    AVAssetWriter *assetWriter = [AVAssetWriter assetWriterWithURL:outURL fileType:AVFileTypeWAVE error:&error];
    if (error) return NO;

    AVAssetWriterInput *assetWriterInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeAudio
                                                                              outputSettings:audioSetting];
    assetWriterInput.expectsMediaDataInRealTime = NO;
    if (![assetWriter canAddInput:assetWriterInput]) return NO;
    [assetWriter addInput:assetWriterInput];
    if (![assetWriter startWriting]) return NO;

    //[assetReader retain];
    //[assetWriter retain];

    [assetWriter startSessionAtSourceTime:kCMTimeZero];

    dispatch_queue_t queue = dispatch_queue_create("assetWriterQueue", NULL);
    [assetWriterInput requestMediaDataWhenReadyOnQueue:queue usingBlock:^{
        NSLog(@"start");
        while (1) {
            if ([assetWriterInput isReadyForMoreMediaData] && (assetReader.status == AVAssetReaderStatusReading)) {
                CMSampleBufferRef sampleBuffer = [audioMixOutput copyNextSampleBuffer];
                if (sampleBuffer) {
                    [assetWriterInput appendSampleBuffer:sampleBuffer];
                    CFRelease(sampleBuffer);
                } else {
                    [assetWriterInput markAsFinished];
                    break;
                }
            }
        }
        [assetWriter finishWriting];
        //[self playWavFile];
        NSError *err;
        NSData *audioData = [NSData dataWithContentsOfFile:soundFilePath options:0 error:&err];
        [self.audioDelegate doneRecording:audioData];
        //[assetReader release];
        //[assetWriter release];
        NSLog(@"soundFilePath=%@", soundFilePath);
        NSDictionary *dict = [[NSFileManager defaultManager] attributesOfItemAtPath:soundFilePath error:&err];
        NSLog(@"size of wav file = %@", [dict objectForKey:NSFileSize]);
        //NSLog(@"finish");
    }];

    return YES;
}

In this function, I call the audioDelegate's doneRecording function with audioData, which is now in wav format. Here is the code for doneRecording.

-(void) doneRecording:(NSData *)contents
{
    myContents = [[NSData dataWithData:contents] retain];
    [self returnResult:alertCallbackId args:@"Recording Done.", nil];
}

// Call this function when you have results to send back to javascript callbacks
// callbackId : int comes from handleCall function
// args: list of objects to send to the javascript callback
- (void)returnResult:(int)callbackId args:(id)arg, ...
{
    if (callbackId == 0) return;

    va_list argsList;
    NSMutableArray *resultArray = [[NSMutableArray alloc] init];

    if (arg != nil) {
        [resultArray addObject:arg];
        va_start(argsList, arg);
        while ((arg = va_arg(argsList, id)) != nil)
            [resultArray addObject:arg];
        va_end(argsList);
    }

    NSString *resultArrayString = [json stringWithObject:resultArray allowScalar:YES error:nil];
    [self performSelectorOnMainThread:@selector(stringByEvaluatingJavaScriptFromString:)
                           withObject:[NSString stringWithFormat:@"NativeBridge.resultForCallback(%d,%@);", callbackId, resultArrayString]
                        waitUntilDone:NO];
    [resultArray release];
}

Step 3: Now it's time to tell the JavaScript inside the UIWebView that we have recorded the audio, so that it can start receiving the data in chunks from us. I use websockets to transfer the data back to JavaScript. The data is transferred in chunks because the server ( https://github.com/benlodotcom/BLWebSocketsServer ) that I used was built using libwebsockets ( http://git.warmcat.com/cgi-bin/cgit/libwebsockets/ ).

This is how you start the server in the delegate class.

- (id)initWithFrame:(CGRect)frame
{
    if (self = [super initWithFrame:frame]) {
        [self _createServer];
        [self.server start];
        myContents = [NSData data];

        // Set delegate in order for "shouldStartLoadWithRequest" to be called
        self.delegate = self;
        // Set non-opaque in order to make "body{background-color:transparent}" work!
        self.opaque = NO;

        // Instantiate JSON parser library
        json = [SBJSON new];

        // load our html file
        NSString *path = [[NSBundle mainBundle] pathForResource:@"webview-document" ofType:@"html"];
        [self loadRequest:[NSURLRequest requestWithURL:[NSURL fileURLWithPath:path]]];
    }
    return self;
}

-(void) _createServer
{
    /* Create a simple echo server */
    self.server = [[BLWebSocketsServer alloc] initWithPort:9000 andProtocolName:echoProtocol];
    [self.server setHandleRequestBlock:^NSData *(NSData *data) {

        NSString *convertedString = [[NSString alloc] initWithData:data encoding:NSUTF8StringEncoding];
        NSLog(@"Received Request...%@", convertedString);

        if ([convertedString isEqualToString:@"start"]) {
            NSLog(@"myContents size: %d", [myContents length]);

            int contentSize = [myContents length];
            int chunkSize = 64*1023;
            chunksCount = ([myContents length]/(64*1023)) + 1;

            NSLog(@"ChunkSize=%d", chunkSize);
            NSLog(@"chunksCount=%d", chunksCount);

            chunksArray = [[NSMutableArray array] retain];
            int index = 0;
            //NSRange chunkRange;

            for (int i = 1; i <= chunksCount; i++) {
                if (i == chunksCount) {
                    NSRange chunkRange = {index, contentSize-index};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)", i, index, contentSize-index);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    [chunksArray addObject:dataChunk];
                    break;
                } else {
                    NSRange chunkRange = {index, chunkSize};
                    NSLog(@"chunk# = %d, chunkRange=(%d,%d)", i, index, chunkSize);
                    NSData *dataChunk = [myContents subdataWithRange:chunkRange];
                    index += chunkSize;
                    [chunksArray addObject:dataChunk];
                }
            }
            return [chunksArray objectAtIndex:0];
        } else {
            int chunkNumber = [convertedString intValue];
            if (chunkNumber > 0 && (chunkNumber+1) <= chunksCount) {
                return [chunksArray objectAtIndex:(chunkNumber)];
            }
        }

        NSLog(@"Releasing Array");
        [chunksArray release];
        chunksCount = 0;
        return [NSData dataWithBase64EncodedString:@"Stop"];
    }];
}

JavaScript side code:

var socket;
var chunkCount = 0;
var soundBlob, soundUrl;
var smallBlobs = new Array();

function captureMovieCallback(response) {
    if (socket) {
        try {
            socket.send('start');
        } catch(e) {
            log('Socket is not valid object');
        }
    } else {
        log('socket is null');
    }
}

function closeSocket(response) {
    socket.close();
}

function connect() {
    try {
        window.WebSocket = window.WebSocket || window.MozWebSocket;

        socket = new WebSocket('ws://127.0.0.1:9000', 'echo-protocol');

        socket.onopen = function() {
        }

        socket.onmessage = function(e) {
            var data = e.data;
            if (e.data instanceof ArrayBuffer) {
                log('its arrayBuffer');
            } else if (e.data instanceof Blob) {
                if (soundBlob)
                    log('its Blob of size = ' + e.data.size + ' final blob size:' + soundBlob.size);

                if (e.data.size != 3) {
                    //log('its Blob of size = '+ e.data.size);
                    smallBlobs[chunkCount] = e.data;
                    chunkCount = chunkCount + 1;
                    socket.send('' + chunkCount);
                } else {
                    //alert('End Received');
                    try {
                        soundBlob = new Blob(smallBlobs, { "type" : "audio/wav" });
                        var myURL = window.URL || window.webkitURL;
                        soundUrl = myURL.createObjectURL(soundBlob);
                        log('soundURL=' + soundUrl);
                    } catch(e) {
                        log('Problem creating blob and url.');
                    }

                    try {
                        var serverUrl = 'http://10.44.45.74:8080/MyTestProject/WebRecording?record';
                        var xhr = new XMLHttpRequest();
                        xhr.open('POST', serverUrl, true);
                        xhr.setRequestHeader("content-type", "multipart/form-data");
                        xhr.send(soundBlob);
                    } catch(e) {
                        log('error uploading blob file');
                    }

                    socket.close();
                }
                //alert(JSON.stringify(msg, null, 4));
            } else {
                log('dont know');
            }
        }

        socket.onclose = function() {
            //message('<p class="event">Socket Status: '+socket.readyState+' (Closed)');
            log('final blob size:' + soundBlob.size);
        }
    } catch(exception) {
        log('<p>Error: ' + exception);
    }
}

function log(msg) {
    NativeBridge.log(msg);
}

function stopCapture() {
    NativeBridge.call("stopMovie", null, null);
}

function startCapture() {
    NativeBridge.call("captureMovie", null, captureMovieCallback);
}

NativeBridge.js

var NativeBridge = {
    callbacksCount : 1,
    callbacks : {},

    // Automatically called by native layer when a result is available
    resultForCallback : function resultForCallback(callbackId, resultArray) {
        try {
            var callback = NativeBridge.callbacks[callbackId];
            if (!callback) return;
            console.log("calling callback for " + callbackId);
            callback.apply(null, resultArray);
        } catch(e) { alert(e) }
    },

    // Use this in javascript to request native objective-c code
    // functionName : string (I think the name is explicit :p)
    // args : array of arguments
    // callback : function with n-arguments that is going to be called when the native code returns
    call : function call(functionName, args, callback) {
        //alert("call");
        //alert('callback='+callback);
        var hasCallback = callback && typeof callback == "function";
        var callbackId = hasCallback ? NativeBridge.callbacksCount++ : 0;

        if (hasCallback)
            NativeBridge.callbacks[callbackId] = callback;

        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "js-frame:" + functionName + ":" + callbackId + ":" + encodeURIComponent(JSON.stringify(args)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;
    },

    log : function log(message) {
        var iframe = document.createElement("IFRAME");
        iframe.setAttribute("src", "ios-log:" + encodeURIComponent(JSON.stringify("#iOS#" + message)));
        document.documentElement.appendChild(iframe);
        iframe.parentNode.removeChild(iframe);
        iframe = null;
    }
};
  • We call connect() from the JavaScript side when the body loads on the HTML side.

  • As soon as we get a callback (captureMovieCallback) from the startCapture function, we send a start message indicating that we are ready to accept the data.

  • The server on the Objective-C side splits the wav sound data into small chunks (chunkSize = 64*1023 bytes) and stores them in an array.

  • It sends the first chunk back to the JavaScript side.

  • JavaScript accepts this chunk and sends back the number of the next chunk it needs from the server.

  • The server sends the chunk indicated by that number. This process repeats until we have sent the last chunk to JavaScript.

  • After the last chunk, we send a stop message back to the JavaScript side indicating that we are done. It happens to be 3 bytes long, which is used as the criterion to break out of this loop.

  • Each chunk is stored as a small blob in an array. We then create one large blob from these small blobs using the following line:

    soundBlob = new Blob(smallBlobs, {"type": "audio/wav"});

    This blob is uploaded to a server that writes it out as a wav file. We can then pass the URL of this wav file as the src of an audio tag to play it on the JavaScript side (see the sketch after this list).

  • We close the websocket connection after sending the blob to the server.

    Hope this is clear enough to understand.
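For completeness, a minimal sketch of that last playback step. This is not part of the code above; the audio element id "player" is an assumption, and the src could equally be the URL of the wav file saved by the upload server:

    // Minimal sketch: play the recorded wav on the JavaScript side.
    // Assumes an <audio id="player"></audio> element exists in the page;
    // soundUrl is the object URL created from soundBlob above.
    function playRecordedWav(wavUrl) {
        var audio = document.getElementById('player');
        audio.src = wavUrl; // or the URL of the wav file stored on the server
        audio.play();
    }

    // e.g. call it right after createObjectURL:
    // playRecordedWav(soundUrl);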

+2

If all you want to do is play the sound, then you would be better off using one of the built-in sound systems in iOS rather than an HTML audio tag.

0
