
Connect Flutter just_audio to Amazon Polly

I am building a Flutter app which uses the just_audio library to play short voice prompts.

If I generate an mp3 file manually using Amazon Polly text-to-speech in the AWS console, upload the resulting mp3 file to an S3 bucket, grant public access and copy the S3 object URL, then just_audio downloads and plays the audio correctly.

// Dart
final AudioPlayer audioPlayer = AudioPlayer();
audioPlayer.setUrl('https://bucketname.s3.eu-west-2.amazonaws.com/say_someting.mp3');
audioPlayer.play();

However, I need the audio to be generated from text on the fly, allowing voice prompts whose content is unknown until runtime. Hence I have also built a Java Spring Boot microservice that exposes an endpoint which calls Amazon Polly. Streaming is not necessary, given that each prompt lasts less than about three seconds and the mp3 data is about 15KB.

// Java@GetMapping(path = "/mp3", produces = "audio/mpeg")public byte[] getAudio(){// Some piece of runtime textfinal String whatToSay = "The time is now " + System.currentTimeMillis();// Create a request for Polly to turn the text to audiofinal SynthesizeSpeechRequest synthesizeSpeechRequest = new SynthesizeSpeechRequest().withOutputFormat(OutputFormat.Mp3).withVoiceId(VoiceId.Amy).withText(whatToSay);// Use SSML for intonation control// synthesizeSpeechRequest.setTextType(TextType.Ssml);final SynthesizeSpeechResult synthesizeSpeechResult = amazonPolly.synthesizeSpeech(synthesizeSpeechRequest);// The content it typically just a few kilobytes, so we can just collect it and send it as an array of bytes final ByteArrayOutputStream outputStream = new ByteArrayOutputStream();IOUtils.copy(synthesizeSpeechResult.getAudioStream(), outputStream);return outputStream.toByteArray();}

If I deploy this microservice to Elastic Beanstalk and access the endpoint with a browser, the browser plays the audio correctly. If I download the mp3 from this endpoint, the file plays in QuickTime. So there does not appear to be anything wrong server-side.

// Dart
final AudioPlayer audioPlayer = AudioPlayer();
audioPlayer.setUrl('https://my-domain/mp3');
audioPlayer.play();

A problem occurs when I point the just_audio player to the endpoint. An exception is thrown with the message ‘source error’.

How can I get my Spring Boot app and the just_audio player to work together?

I have tried stepping through the just_audio source to find more detail about the cause of the problem, but the trail goes cold because the error originates in async code and is attached to a Future.

I have tried including an empty header map when setting the URL, because I can see that just_audio checks for this.

I have tried the flutter_tts package, but I cannot find any support for controlling the emphasis of particular words in a sentence. (Amazon Polly supports SSML and hence offers fine control over intonation.)
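
For reference, the kind of control I am after from Polly uses SSML, which is what the commented-out setTextType line in the handler above is for. A minimal sketch (the sentence and emphasis level are purely illustrative):

// Java
// Sketch only: the same request as above, but sending SSML so that
// individual words can be emphasised.
final String ssml =
        "<speak>The time is <emphasis level=\"strong\">now</emphasis> "
                + System.currentTimeMillis() + "</speak>";

final SynthesizeSpeechRequest ssmlRequest = new SynthesizeSpeechRequest()
        .withOutputFormat(OutputFormat.Mp3)
        .withVoiceId(VoiceId.Amy)
        .withTextType(TextType.Ssml) // tell Polly the text is SSML rather than plain text
        .withText(ssml);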

I have tried streaming the output of Amazon Polly to the Response stream instead of buffering the audio data.
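
The streaming attempt was roughly along these lines (a sketch assuming Spring MVC's StreamingResponseBody and the same amazonPolly client as before; the /mp3-streamed path is hypothetical and the real code may differ in detail):

// Java
// Sketch only: streaming Polly's audio straight to the servlet response
// instead of buffering it into a byte array first.
@GetMapping(path = "/mp3-streamed", produces = "audio/mpeg")
public StreamingResponseBody getAudioStreamed() {
    final String whatToSay = "The time is now " + System.currentTimeMillis();

    final SynthesizeSpeechRequest request = new SynthesizeSpeechRequest()
            .withOutputFormat(OutputFormat.Mp3)
            .withVoiceId(VoiceId.Amy)
            .withText(whatToSay);

    final InputStream audioStream = amazonPolly.synthesizeSpeech(request).getAudioStream();

    // Copy the Polly audio to the response output stream as it arrives.
    return outputStream -> IOUtils.copy(audioStream, outputStream);
}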

I have set up a (temporary) URL which demonstrates the problem here: http://pollyjustaudio-env.eba-cn2msugj.eu-west-2.elasticbeanstalk.com/audio/mp3

