
fix: App freezing #201


Open
TimeLord2010 opened this issue Mar 12, 2025 · 1 comment

Comments

@TimeLord2010

Description

When in buffering mode, the app crashed some time after receiving data. I strongly believe this was caused by feeding audio data to the player while (or before) the buffer stream was being created.

I had been struggling with this issue for about two weeks, trying different approaches and even other packages (turns out, yours is actually the best for my case). The app would start to freeze suddenly, without any notice or error message, even when I moved the player code to an isolate.

Here is my player class:

import 'dart:async';
import 'dart:typed_data';

import 'package:flutter_soloud/flutter_soloud.dart';
import 'package:vit_gpt_flutter_api/data/contracts/realtime_audio_player.dart';

class AlternativeRealtimePlayer with RealtimeAudioPlayer {
  final _player = SoLoud.instance;
  AudioSource? _source;

  bool _isPlaying = false;

  Completer? _completer;

  @override
  Future<void> appendBytes(Uint8List audioData) async {
    await _completer?.future;
    _player.addAudioDataStream(_source!, audioData);
    if (!_isPlaying) {
      _isPlaying = true;
      await _player.play(_source!);
    }
  }

  @override
  Future<void> createBufferStream() async {
    var c = _completer = Completer();
    if (!_player.isInitialized) {
      await _player.init(
        automaticCleanup: true,
        channels: Channels.mono,
        sampleRate: 24000,
      );
    }

    await _player.disposeAllSources();

    _source = _player.setBufferStream(
      channels: Channels.mono,
      sampleRate: 24000,
      format: BufferType.s16le,
      bufferingType: BufferingType.released,
    );
    c.complete();
  }

  @override
  void dispose() {
    _player.disposeAllSources();
  }

  @override
  Future<void> disposeBufferStream() async {
    _isPlaying = false;
  }
}

After I introduced the Completer to prevent data from being sent to the player before the buffer was created, the issue no longer appeared. But it would be awesome if an error message warned the dev about this.

Another possible cause, which I did not test in isolation, is initializing the player with no parameters for channels and sample rate. All I did was add the Completer and pass those parameters to the init call, and it worked, so either change could be what solved it.
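
For illustration, this is the kind of loud failure I would have liked instead of a silent freeze. It is only a sketch built on the same SoLoud calls shown above (isInitialized, setBufferStream, addAudioDataStream); the function name is made up and I have not shipped this:

import 'dart:typed_data';

import 'package:flutter_soloud/flutter_soloud.dart';

/// Sketch only: refuse to feed data until the engine and the buffer
/// stream actually exist, and say so, instead of freezing silently.
void appendBytesChecked(AudioSource? source, Uint8List audioData) {
  final player = SoLoud.instance;
  if (!player.isInitialized) {
    throw StateError('SoLoud is not initialized; call init() first.');
  }
  if (source == null) {
    throw StateError(
      'No buffer stream yet; call setBufferStream() before feeding data.',
    );
  }
  player.addAudioDataStream(source, audioData);
}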

Additional Context

Platform: iOS 18.

@alnitak
Owner

alnitak commented Mar 13, 2025

Hi @TimeLord2010,

I tried your code with your vit_gpt_flutter_api package, but only on the simulator. With or without the completer, the code worked as expected.

To feed the buffer, I used random numbers.

The code I used:
import 'dart:async';
import 'dart:developer' as dev;
import 'dart:math';
import 'package:flutter/foundation.dart';
import 'package:flutter/material.dart';
import 'package:flutter_soloud/flutter_soloud.dart';
import 'package:logging/logging.dart';
import 'package:vit_gpt_flutter_api/data/contracts/realtime_audio_player.dart';

void main() async {
  // The `flutter_soloud` package logs everything
  // (from severe warnings to fine debug messages)
  // using the standard `package:logging`.
  // You can listen to the logs as shown below.
  Logger.root.level = kDebugMode ? Level.FINE : Level.INFO;
  Logger.root.onRecord.listen((record) {
    dev.log(
      record.message,
      time: record.time,
      level: record.level.value,
      name: record.loggerName,
      zone: record.zone,
      error: record.error,
      stackTrace: record.stackTrace,
    );
  });

  WidgetsFlutterBinding.ensureInitialized();

  /// Initialize the player.
  await SoLoud.instance.init(sampleRate: 24000);

  runApp(
    const MaterialApp(
      home: Scaffold(body: MyApp()),
    ),
  );
}

class MyApp extends StatefulWidget {
  const MyApp({super.key});

  @override
  State<MyApp> createState() => _MyAppState();
}

class _MyAppState extends State<MyApp> {
  final _player = AlternativeRealtimePlayer();
  late final Uint8List audioData;
  Timer? _timer;

  @override
  void initState() {
    super.initState();

    audioData = Uint8List(32768);
    for (var i = 0; i < audioData.length; i++) {
      audioData[i] = Random().nextInt(256);
    }
  }

  @override
  Widget build(BuildContext context) {
    return Center(
      child: Column(
        mainAxisSize: MainAxisSize.min,
        children: [
          OutlinedButton(
            onPressed: () async {
              await _player.createBufferStream();
              await _player.appendBytes(audioData);
            },
            child: const Text('setup'),
          ),
          OutlinedButton(
            onPressed: () async {
              _timer = Timer.periodic(
                const Duration(milliseconds: 500),
                (callback) => _player.appendBytes(audioData),
              );
            },
            child: const Text('addData'),
          ),
          OutlinedButton(
            onPressed: () async {
              _timer?.cancel();
              await _player.disposeBufferStream();
              _player.dispose();
            },
            child: const Text('dispose'),
          ),
        ],
      ),
    );
  }
}

class AlternativeRealtimePlayer with RealtimeAudioPlayer {
  final _player = SoLoud.instance;
  AudioSource? _source;
  bool _isPlaying = false;

  @override
  Future<void> appendBytes(Uint8List audioData) async {
    if (_source == null) {
      return;
    }
    _player.addAudioDataStream(_source!, audioData);
    if (!_isPlaying) {
      _isPlaying = true;
      await _player.play(_source!);
    }
  }

  @override
  Future<void> createBufferStream() async {
    if (!_player.isInitialized) {
      await _player.init(
        automaticCleanup: true,
        channels: Channels.mono,
        sampleRate: 24000,
      );
    }

    await _player.disposeAllSources();

    _source = _player.setBufferStream(
      channels: Channels.mono,
      sampleRate: 24000,
      format: BufferType.s16le,
      bufferingType: BufferingType.released,
      onBuffering: (isBuffering, handle, time) {
        print('isBuffering: $isBuffering, handle: $handle, time: $time');
      },
    );
  }

  @override
  void dispose() {
    _player.disposeAllSources();
    _source = null;
  }

  @override
  Future<void> disposeBufferStream() async {
    _isPlaying = false;
  }
}

Also, addAudioDataStream should throw if the player has not been initialized or the AudioSource doesn't exist.
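
Roughly what I mean (only a sketch, and I am not pinning down the exact exception types the package throws here): feeding data without a valid stream should surface as an exception the caller can log, rather than a freeze.

import 'dart:developer' as dev;
import 'dart:typed_data';

import 'package:flutter_soloud/flutter_soloud.dart';

/// Sketch: surface a rejected addAudioDataStream call instead of freezing.
/// The catch is deliberately broad because the exact exception types are
/// not spelled out here.
void tryAppend(AudioSource source, Uint8List audioData) {
  try {
    SoLoud.instance.addAudioDataStream(source, audioData);
  } catch (e, st) {
    dev.log('addAudioDataStream rejected the data', error: e, stackTrace: st);
  }
}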

I have no clue about this issue. Could you provide a minimal, complete project without third-party packages to reproduce the issue?
