Merged
3 changes: 2 additions & 1 deletion packages/camera/camera_avfoundation/CHANGELOG.md
@@ -1,5 +1,6 @@
## NEXT
## 0.9.17+4

* Fixes overwriting flag MixWithOthers set by video_player.
Collaborator:
I don't think it's going to be clear to a client of camera what this means; please add more context about what the actual problem being solved is. (Also, it's fine to give video_player as an example, but it shouldn't make it sound like this is specific to video_player; any other native code could collide.)

* Updates minimum supported SDK version to Flutter 3.19/Dart 3.3.

## 0.9.17+3
@@ -338,4 +338,27 @@ - (void)testStartWritingShouldNotBeCalledBetweenSampleCreationAndAppending {
CFRelease(videoSample);
}

- (void)testStartVideoRecordingWithCompletionShouldNotDisableMixWithOthers {
FLTCam *cam = FLTCreateCamWithCaptureSessionQueue(dispatch_queue_create("testing", NULL));

id writerMock = OCMClassMock([AVAssetWriter class]);
OCMStub([writerMock alloc]).andReturn(writerMock);
OCMStub([writerMock initWithURL:OCMOCK_ANY fileType:OCMOCK_ANY error:[OCMArg setTo:nil]])
.andReturn(writerMock);

[AVAudioSession.sharedInstance setCategory:AVAudioSessionCategoryPlayback
withOptions:AVAudioSessionCategoryOptionMixWithOthers
error:nil];

[cam
startVideoRecordingWithCompletion:^(FlutterError *_Nullable error) {
}
messengerForStreaming:nil];
XCTAssert(
AVAudioSession.sharedInstance.categoryOptions & AVAudioSessionCategoryOptionMixWithOthers,
@"Flag MixWithOthers was removed.");
XCTAssert(AVAudioSession.sharedInstance.category == AVAudioSessionCategoryPlayAndRecord,
@"Category should be PlayAndRecord.");
}

@end
@@ -227,7 +227,7 @@ - (void)prepareForVideoRecordingWithCompletion:
(nonnull void (^)(FlutterError *_Nullable))completion {
__weak typeof(self) weakSelf = self;
dispatch_async(self.captureSessionQueue, ^{
[weakSelf.camera setUpCaptureSessionForAudio];
[weakSelf.camera setUpCaptureSessionForAudioIfNeeded];
completion(nil);
});
}
@@ -184,6 +184,8 @@ - (instancetype)initWithMediaSettings:(FCPPlatformMediaSettings *)mediaSettings
_videoFormat = kCVPixelFormatType_32BGRA;
_inProgressSavePhotoDelegates = [NSMutableDictionary dictionary];
_fileFormat = FCPPlatformImageFileFormatJpeg;
_videoCaptureSession.automaticallyConfiguresApplicationAudioSession = NO;
_audioCaptureSession.automaticallyConfiguresApplicationAudioSession = NO;

// To limit memory consumption, limit the number of frames pending processing.
// After some testing, 4 was determined to be the best maximum value.
@@ -673,7 +675,8 @@ - (void)captureOutput:(AVCaptureOutput *)output
if (_isFirstVideoSample) {
[_videoWriter startSessionAtSourceTime:currentSampleTime];
// fix sample times not being numeric when pause/resume happens before first sample buffer
// arrives https://github.com/flutter/flutter/issues/132014
// arrives
// https://github.com/flutter/flutter/issues/132014
_lastVideoSampleTime = currentSampleTime;
_lastAudioSampleTime = currentSampleTime;
_isFirstVideoSample = NO;
@@ -1231,9 +1234,7 @@ - (BOOL)setupWriterForPath:(NSString *)path {
return NO;
}

if (_mediaSettings.enableAudio && !_isAudioSetup) {
[self setUpCaptureSessionForAudio];
}
[self setUpCaptureSessionForAudioIfNeeded];

_videoWriter = [[AVAssetWriter alloc] initWithURL:outputURL
fileType:AVFileTypeMPEG4
@@ -1313,9 +1314,45 @@ - (BOOL)setupWriterForPath:(NSString *)path {
return YES;
}

- (void)setUpCaptureSessionForAudio {
// this same function is also in video_player_avfoundation
Collaborator:
Comments should be properly formatted sentences.

https://google.github.io/styleguide/objcguide.html#comments

// configure application wide audio session manually to prevent overwriting
// flag MixWithOthers by capture session, only change category if it is considered
// as upgrade which means it can only enable ability to play in silent mode or
Contributor:
enable ability to play in silent mode

Is this the desired behavior?

Contributor Author:
The system itself does this at some point at or after AVCaptureSession addInput or addOutput when automaticallyConfiguresApplicationAudioSession is YES, which is the default; this behaviour was there before as well.

// ability to record audio but never disables it, that could affect other plugins
Contributor:
affect other plugins

Can you explicitly mention video player as an example?

// which depend on this global state, only change category or options if there is
// change to prevent unnecessary lags and silence
Collaborator:
It's not clear to me how this last part is different from what the comment already said so far (possibly because the sentence is a run-on by this point so it's hard to follow). Please reword to clarify if this is supposed to be adding new details.

// https://github.com/flutter/flutter/issues/131553
Collaborator:
This doesn't need an issue link; we generally only link to issues for TODOs, or cases that are not understandable from a comment (e.g., very subtle edge cases). The idea of only upgrading global mode is clear enough.

static void upgradeAudioSessionCategory(AVAudioSessionCategory category,
AVAudioSessionCategoryOptions options,
AVAudioSessionCategoryOptions clearOptions) {
if (category == nil) {
Contributor:
Can you explain what category means here? Is it the category that we want to upgrade to, or the category that we want to upgrade from?

Contributor Author (@misos1, Aug 20, 2024):
Actually it is the category that needs to be combined with the existing category. The newly set category should be that combination.

Contributor:
Can you rename this to requestedCategory? Also, when would we need to pass a nil category param? What does requesting a nil category mean? And why are we assigning category = AVAudioSession.sharedInstance.category when it's nil?

category = AVAudioSession.sharedInstance.category;
}
NSSet *playCategories = [NSSet
setWithObjects:AVAudioSessionCategoryPlayback, AVAudioSessionCategoryPlayAndRecord, nil];
NSSet *recordCategories =
[NSSet setWithObjects:AVAudioSessionCategoryRecord, AVAudioSessionCategoryPlayAndRecord, nil];
NSSet *categories = [NSSet setWithObjects:category, AVAudioSession.sharedInstance.category, nil];
Contributor:
What does this categories set mean? Why are we combining the sets? Can you add some comments to explain?

Contributor Author (@misos1, Aug 20, 2024):
These are the categories that should be combined. Upgrading means we need to combine the categories so that the new category has the play flavour if either of them has it, and the record flavour if either of them has it. The test for whether any of these categories has play is done by intersection, i.e. whether anything in the first set is present in the second set.

Contributor:
Can you rename this to requiredFinalCategories or something similar

BOOL needPlay = [categories intersectsSet:playCategories];
BOOL needRecord = [categories intersectsSet:recordCategories];
if (needPlay && needRecord) {
category = AVAudioSessionCategoryPlayAndRecord;
} else if (needPlay) {
category = AVAudioSessionCategoryPlayback;
} else if (needRecord) {
category = AVAudioSessionCategoryRecord;
}
options = (AVAudioSession.sharedInstance.categoryOptions & ~clearOptions) | options;
if ([category isEqualToString:AVAudioSession.sharedInstance.category] &&
options == AVAudioSession.sharedInstance.categoryOptions) {
return;
}
[AVAudioSession.sharedInstance setCategory:category withOptions:options error:nil];
}
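As a worked example of the upgrade semantics discussed above (a sketch only; the starting state is assumed and not part of the PR): suppose another plugin such as video_player has already configured the session for playback with mixing, and the camera then requests recording.

// Assumed starting state (e.g. previously set by video_player):
//   category == AVAudioSessionCategoryPlayback
//   options  == AVAudioSessionCategoryOptionMixWithOthers
upgradeAudioSessionCategory(AVAudioSessionCategoryPlayAndRecord,
                            AVAudioSessionCategoryOptionDefaultToSpeaker, 0);
// Both "play" and "record" are now needed, so the category is upgraded to
// AVAudioSessionCategoryPlayAndRecord, and the options are OR-ed rather than
// replaced, leaving MixWithOthers | DefaultToSpeaker set.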

- (void)setUpCaptureSessionForAudioIfNeeded {
// Don't setup audio twice or we will lose the audio.
if (_isAudioSetup) {
if (!_mediaSettings.enableAudio || _isAudioSetup) {
return;
}

@@ -1331,6 +1368,19 @@ - (void)setUpCaptureSessionForAudio {
// Setup the audio output.
_audioOutput = [[AVCaptureAudioDataOutput alloc] init];

dispatch_block_t block = ^{
upgradeAudioSessionCategory(AVAudioSessionCategoryPlayAndRecord,
AVAudioSessionCategoryOptionDefaultToSpeaker |
AVAudioSessionCategoryOptionAllowBluetoothA2DP |
AVAudioSessionCategoryOptionAllowAirPlay,
Contributor:
Can you explain why these options?

Contributor Author:
DefaultToSpeaker is what it was setting before with automatic session configuration as well, so it is better not to change that behaviour. It is also an implicit default for AVAudioSessionCategoryPlayback, along with AllowBluetoothA2DP and AllowAirPlay, which cannot be turned off there (it is not set by default for PlayAndRecord). The video player uses that category, so it is better not to change the player's behaviour just by initializing the camera (with audio).

Contributor:
Can you add the comments explaining this?

0);
Contributor:
I still think it's better not to pass this param. It's making the function a bit complicated to read.

Contributor Author:
Ideally that function would be in a separate file shared with video_player. Is it possible to do that?

Contributor Author:
Or do you mean it would be better to have it as an Objective-C selector rather than a C function?

Contributor:
I don't think it's possible to share video player and camera code. Our plugins are not set up to do so.

I meant it seems we only pass 0 and never other values to this param. So do we really need this param?

};
if (!NSThread.isMainThread) {
dispatch_sync(dispatch_get_main_queue(), block);
} else {
block();
Contributor:
Is setUpCaptureSessionForAudioIfNeeded already guaranteed to be on a background thread? Can you double check the caller of this function? If it's already on a background thread, we can simply do

NSAssert(!NSThread.isMainThread, @"must not be called on the main thread");
dispatch_sync(dispatch_get_main_queue(), ^{
  // your logic here
});

Contributor Author:
It is called on the main thread in tests if I remember correctly #7143 (comment).

Contributor:
Is it only called on the main thread in tests? Can you give an example of such a failure? I'd like to see if it's possible to run on a background thread in those tests.

Contributor Author (@misos1, Nov 29, 2024):
Try to run tests with this:

  //if (!NSThread.isMainThread) {
    dispatch_sync(dispatch_get_main_queue(), block);
  //} else {
    //block();
  //}

And you will get this inside Xcode:

Thread 1: EXC_BREAKPOINT (code=1, subcode=0x18082dc74)

When running outside of Xcode with debugging there will probably just be a crash; you can add NSLog(@"%@", NSThread.currentThread); to see that it runs on the main thread: <_NSMainThread: 0x2804b0080>{number = 1, name = main}.

You will probably need to run flutter pub upgrade first, as without that I am getting this error during flutter run (win32 on macOS?):

Could not build the precompiled application for the device.
Error (Xcode): ../../../../../../../../.pub-cache/hosted/pub.dev/win32-5.2.0/lib/src/guid.dart:32:9: Error: Type 'UnmodifiableUint8ListView' not found.

Contributor:
It is called on the main thread in tests if I remember correctly #7143 (comment).

When running outside of Xcode with debugging there will probably just be a crash; you can add NSLog(@"%@", NSThread.currentThread); to see that it runs on the main thread: <_NSMainThread: 0x2804b0080>{number = 1, name = main}.

So setUpCaptureSessionForAudioIfNeeded is called on the main thread somewhere (not just in tests). Could you find the place where it's called on the main thread? The session setup shouldn't happen on the main thread.

Contributor:
But that same situation is also with FLTEnsureToRunOnMainQueue.

Could you explain the problem with FLTEnsureToRunOnMainQueue? Maybe a concrete example?

How exactly is this a problem anyway? There are many other "adaptations" only for tests, like interfaces and factories just to simplify testing, in both camera and video_player.

I don't know why you are comparing with interfaces and factories.

Imagine you have this code:

func setupSession() {
  if is background thread {
    Logic A
  } else {
    Logic B
  }
}

Here your production code only calls Logic A, and test code only calls Logic B. Then the test isn't really providing any confidence right?

What I suggest is:

func setupSession() { 
  Assert(session must be setup on background);
  Logic A;
}

Then in the test, you can do:

func testSetupSession() {
  dispatch_to_background {
    setupSession();
  }
}

This way, the test code actually covers Logic A, providing us with confidence in production.

Contributor Author (@misos1, Dec 11, 2024):
Your assertion will fail because other tests call setupSession on the main thread. And would this not cause a deadlock, with dispatch_sync waiting for a block dispatched on the main queue while waitForExpectation is waiting on the main thread until dispatch_sync is done and the expectation is fulfilled? discussion_r1876564910

testFoo() {
  let expectation = ...
  sessionQueue.async {
    dispatch_sync(main_queue(), ...)
    expectation.fulfill()
  }
  waitForExpectation()
}

What I proposed does not have such problems and tests also cover production logic as I wrote.

Contributor:
And would this not cause a deadlock, with dispatch_sync waiting for a block dispatched on the main queue while waitForExpectation is waiting on the main thread until dispatch_sync is done and the expectation is fulfilled?

Oh I overlooked the part that it's dispatch_sync to main, which leads to deadlock in test.

What I proposed does not have such problems and tests also cover production logic as I wrote.

It would be the same problem: dispatch_sync(main) is inherently not testable. You simply moved the same problem to FLTEnsureToRunOnMainQueueSync (i.e., the dispatch_sync branch of that helper would not be testable for the same reason). So I'd accept the current code as it is.

Contributor Author:
It would be the same problem

Oh, I did not think of that, but as you wrote that, I realized it would mean the async test should be untestable too; maybe it actually waits in some non-blocking way. But anyway, I did not see that documented anywhere, so maybe that async test is not 100% clean then?

Contributor:
I realized it would mean the async test should be untestable too

dispatch_async is different and it can be tested. The problem with dispatch_sync is that it causes deadlock when dispatching to the same queue.
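A minimal sketch of the deadlock being described (illustrative only, not code from the PR): dispatch_sync blocks the calling thread until the block has run, so synchronously dispatching onto the queue the caller is already serving can never complete, whereas dispatch_async returns immediately.

// On the main thread:
dispatch_sync(dispatch_get_main_queue(), ^{
  // Never runs: the main thread is blocked waiting for this block, but the
  // block can only execute on that same (blocked) main thread. Deadlock.
});
dispatch_async(dispatch_get_main_queue(), ^{
  // Fine: the caller is not blocked; the block runs on a later run-loop turn.
});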

}

if ([_audioCaptureSession canAddInput:audioInput]) {
[_audioCaptureSession addInput:audioInput];

@@ -113,7 +113,7 @@ NS_ASSUME_NONNULL_BEGIN
- (void)startImageStreamWithMessenger:(NSObject<FlutterBinaryMessenger> *)messenger;
- (void)stopImageStream;
- (void)setZoomLevel:(CGFloat)zoom withCompletion:(void (^)(FlutterError *_Nullable))completion;
- (void)setUpCaptureSessionForAudio;
- (void)setUpCaptureSessionForAudioIfNeeded;
Collaborator:
Is this called outside the implementation? If not it should be moved to the private category in the .m file.

Contributor Author:
It is also called from prepareForVideoRecordingWithCompletion in CameraPlugin.m.


@end

2 changes: 1 addition & 1 deletion packages/camera/camera_avfoundation/pubspec.yaml
@@ -2,7 +2,7 @@ name: camera_avfoundation
description: iOS implementation of the camera plugin.
repository: https://github.com/flutter/packages/tree/main/packages/camera/camera_avfoundation
issue_tracker: https://github.com/flutter/flutter/issues?q=is%3Aissue+is%3Aopen+label%3A%22p%3A+camera%22
version: 0.9.17+3
version: 0.9.17+4

environment:
sdk: ^3.3.0
@@ -1,5 +1,6 @@
## NEXT
## 2.6.2

* Fixes audio recorded only with first recording.
Collaborator:
This description doesn't make sense in the context of this package, which does not record anything. Please describe the fix in terms of what is changing generally, not the specific effect it has on a specific test app that readers of the changelog won't have context on.

* Updates minimum supported SDK version to Flutter 3.19/Dart 3.3.

## 2.6.1
@@ -791,6 +791,40 @@ - (void)testPublishesInRegistration {
}

#if TARGET_OS_IOS
- (void)testVideoPlayerShouldNotOverwritePlayAndRecordNorDefaultToSpeaker {
NSObject<FlutterPluginRegistrar> *registrar = [GetPluginRegistry()
registrarForPlugin:@"testVideoPlayerShouldNotOverwritePlayAndRecordNorDefaultToSpeaker"];
FVPVideoPlayerPlugin *videoPlayerPlugin =
[[FVPVideoPlayerPlugin alloc] initWithRegistrar:registrar];
FlutterError *error;

[AVAudioSession.sharedInstance setCategory:AVAudioSessionCategoryPlayAndRecord
withOptions:AVAudioSessionCategoryOptionDefaultToSpeaker
error:nil];

[videoPlayerPlugin initialize:&error];
[videoPlayerPlugin setMixWithOthers:true error:&error];
XCTAssert(AVAudioSession.sharedInstance.category == AVAudioSessionCategoryPlayAndRecord,
@"Category should be PlayAndRecord.");
XCTAssert(
AVAudioSession.sharedInstance.categoryOptions & AVAudioSessionCategoryOptionDefaultToSpeaker,
@"Flag DefaultToSpeaker was removed.");
XCTAssert(
AVAudioSession.sharedInstance.categoryOptions & AVAudioSessionCategoryOptionMixWithOthers,
@"Flag MixWithOthers should be set.");

id sessionMock = OCMClassMock([AVAudioSession class]);
OCMStub([sessionMock sharedInstance]).andReturn(sessionMock);
OCMStub([sessionMock category]).andReturn(AVAudioSessionCategoryPlayAndRecord);
OCMStub([sessionMock categoryOptions])
.andReturn(AVAudioSessionCategoryOptionMixWithOthers |
AVAudioSessionCategoryOptionDefaultToSpeaker);
OCMReject([sessionMock setCategory:OCMOCK_ANY withOptions:0 error:[OCMArg setTo:nil]])
.ignoringNonObjectArgs();

[videoPlayerPlugin setMixWithOthers:true error:&error];
}

- (void)validateTransformFixForOrientation:(UIImageOrientation)orientation {
AVAssetTrack *track = [[FakeAVAssetTrack alloc] initWithOrientation:orientation];
CGAffineTransform t = FVPGetStandardizedTransformForTrack(track);
@@ -708,7 +708,7 @@ - (int64_t)onPlayerSetup:(FVPVideoPlayer *)player frameUpdater:(FVPFrameUpdater
- (void)initialize:(FlutterError *__autoreleasing *)error {
#if TARGET_OS_IOS
// Allow audio playback when the Ring/Silent switch is set to silent
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
upgradeAudioSessionCategory(AVAudioSessionCategoryPlayback, 0, 0);
#endif

[self.playersByTextureId
@@ -813,17 +813,53 @@ - (void)pausePlayer:(NSInteger)textureId error:(FlutterError **)error {
[player pause];
}

// this same function is also in camera_avfoundation
// do not overwrite PlayAndRecord with Playback which causes inability to record
// audio, do not overwrite all options, only change category if it is considered
// as upgrade which means it can only enable ability to play in silent mode or
// ability to record audio but never disables it, that could affect other plugins
// which depend on this global state, only change category or options if there is
// change to prevent unnecessary lags and silence
// https://github.com/flutter/flutter/issues/131553
#if TARGET_OS_IOS
static void upgradeAudioSessionCategory(AVAudioSessionCategory category,
AVAudioSessionCategoryOptions options,
AVAudioSessionCategoryOptions clearOptions) {
if (category == nil) {
category = AVAudioSession.sharedInstance.category;
}
NSSet *playCategories = [NSSet
setWithObjects:AVAudioSessionCategoryPlayback, AVAudioSessionCategoryPlayAndRecord, nil];
NSSet *recordCategories =
[NSSet setWithObjects:AVAudioSessionCategoryRecord, AVAudioSessionCategoryPlayAndRecord, nil];
NSSet *categories = [NSSet setWithObjects:category, AVAudioSession.sharedInstance.category, nil];
BOOL needPlay = [categories intersectsSet:playCategories];
BOOL needRecord = [categories intersectsSet:recordCategories];
if (needPlay && needRecord) {
category = AVAudioSessionCategoryPlayAndRecord;
} else if (needPlay) {
category = AVAudioSessionCategoryPlayback;
} else if (needRecord) {
category = AVAudioSessionCategoryRecord;
}
options = (AVAudioSession.sharedInstance.categoryOptions & ~clearOptions) | options;
if ([category isEqualToString:AVAudioSession.sharedInstance.category] &&
options == AVAudioSession.sharedInstance.categoryOptions) {
return;
}
[AVAudioSession.sharedInstance setCategory:category withOptions:options error:nil];
}
#endif

- (void)setMixWithOthers:(BOOL)mixWithOthers
error:(FlutterError *_Nullable __autoreleasing *)error {
#if TARGET_OS_OSX
// AVAudioSession doesn't exist on macOS, and audio always mixes, so just no-op.
#else
if (mixWithOthers) {
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback
withOptions:AVAudioSessionCategoryOptionMixWithOthers
error:nil];
upgradeAudioSessionCategory(nil, AVAudioSessionCategoryOptionMixWithOthers, 0);
} else {
[[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:nil];
upgradeAudioSessionCategory(nil, 0, AVAudioSessionCategoryOptionMixWithOthers);
}
#endif
}
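To illustrate the intended effect (a sketch with an assumed starting state, reusing the videoPlayerPlugin and error names from the test above): because setMixWithOthers now goes through upgradeAudioSessionCategory with a nil category, toggling the option no longer downgrades a category that another plugin may have set.

// Assumed starting state (e.g. set by camera_avfoundation while recording):
//   category == AVAudioSessionCategoryPlayAndRecord
//   options  == AVAudioSessionCategoryOptionDefaultToSpeaker
[videoPlayerPlugin setMixWithOthers:true error:&error];
// Category stays PlayAndRecord (nil means "keep the current flavour");
// options become DefaultToSpeaker | MixWithOthers.
[videoPlayerPlugin setMixWithOthers:false error:&error];
// Category is still PlayAndRecord; only MixWithOthers is cleared, DefaultToSpeaker remains.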
@@ -2,7 +2,7 @@ name: video_player_avfoundation
description: iOS and macOS implementation of the video_player plugin.
repository: https://github.com/flutter/packages/tree/main/packages/video_player/video_player_avfoundation
issue_tracker: https://github.com/flutter/flutter/issues?q=is%3Aissue+is%3Aopen+label%3A%22p%3A+video_player%22
version: 2.6.1
version: 2.6.2

environment:
sdk: ^3.3.0