
live view not working on iPhone #121

Closed
IronOxidizer opened this issue Apr 11, 2021 · 20 comments

@IronOxidizer (Contributor) commented Apr 11, 2021

Describe the bug
An error page is displayed after selecting a live view stream.

e@http://192.168.1.64:8080/static/js/main.7d8533f5.chunk.js:1:29092
http://192.168.1.64:8080/static/js/main.7d8533f5.chunk.js:1:29576
Al@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:348325
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:366138
bl@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:339244
bl@[native code]
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288719
http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:366138
qa@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288665
Ya@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:288600
Ne@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:359949
Qt@http://192.168.1.64:8080/static/js/2.044b71a1.chunk.js:2:267224
Qt@[native code]

Haven't been able to reproduce the error on Windows, Linux, or Android with any browser.

To Reproduce
Steps to reproduce the behavior:

  1. Go to web UI on iOS
  2. Click on Live View
  3. Select camera stream
  4. See error

Expected behavior
3. Select camera
4. Live view of camera appears

Server (please complete the following information):

  • docker latest
  • logs have no anomalies

Screenshots
[screenshot attached]

Smartphone

  • Device: iPhone X
  • OS: iOS 14.4
  • Browser: safari, chrome
@scottlamb added this to the 1.0 milestone Apr 12, 2021
@scottlamb (Owner) commented:

Do you know if it's possible to enable source maps to get a more useful backtrace? I don't have an iPhone or know how to debug on them. I do have a Mac laptop; there's probably an iPhone simulator or something but I've never tried before.

I think it's also possible to apply source maps after the fact, so I guess that's the next step if Mobile Safari doesn't support map files.

It's probably something minor given that this works on desktop Safari.

@IronOxidizer (Contributor, Author) commented Apr 12, 2021

Took some time but I found a way to get console output on iOS (I don't own any Apple devices, this iPhone isn't even mine so debugging with broken tooling took hours).

DEBUG /api/?days=true: 200
ERROR ReferenceError: Can't find variable: MediaSource
ERROR Uncaught error: ReferenceError: Can't find variable: MediaSource [object Object]

if (!MediaSource.isTypeSupported(part.mimeType)) {

This seems to be the only use of MediaSource as far as I can tell, and it looks like a well-known issue. It seems iOS doesn't support MSE, so I'm not sure how to handle this.
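
For what it's worth, a minimal sketch of how that call could be guarded so the missing global fails gracefully rather than throwing (showError and part are placeholders here, not the actual Moonfire code):

const mse = window.MediaSource;
if (mse === undefined) {
  // e.g. iPhone Safari: the MediaSource global doesn't exist at all
  showError("Live view requires Media Source Extensions, which this browser doesn't support.");
} else if (!mse.isTypeSupported(part.mimeType)) {
  showError("Unsupported media type: " + part.mimeType);
} else {
  // safe to proceed with the MSE-based live view
}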

@scottlamb (Owner) commented Apr 12, 2021

Thanks for tracking that down!

Hopefully it isn't as bad as MSE not being supported at all. My whole approach for live view and the planned scrub bar UI depends on it.

https://caniuse.com/?search=mediasource is a bit more mixed. At the top for "Media Source Extensions" for "Safari on iOS" it says "~ Partial Support ... fully supported on iPadOS 13 and later." Kind of vague. Under "MediaSource API", "Safari on iOS" is in green. And I see now a subtest https://caniuse.com/mdn-api_mediasource_istypesupported which says MediaSource.isTypeSupported is supported. So does https://developer.mozilla.org/en-US/docs/Web/API/MediaSource/isTypeSupported . Maybe I'm holding it wrong?

@scottlamb (Owner) commented:

Maybe all the stuff saying it's supported means only on iPads.😱

@scottlamb (Owner) commented:

I guess the alternatives are HLS or WebRTC. There is a shiny new Rust library for WebRTC. Maybe that's the way to go. https://github.com/webrtc-rs/webrtc

@IronOxidizer (Contributor, Author) commented Apr 12, 2021

One thing to note about going for a WebRTC approach is that we might need a different implementation depending on whether the client is on the same network or not. If the client is not on the same network, we would use the normal method of negotiating a connection via STUN servers. If the client is on the same network and we don't have internet connectivity (closed network), we would have to manually send SDP between the client and the server, probably with AJAX.

Maybe I'm overcomplicating it, but this is an issue I previously ran into when working with WebRTC on either LAN or over the internet.
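
To illustrate the closed-network case, a rough sketch of client-side signaling against a hypothetical /api/webrtc endpoint on the NVR (nothing like this exists in Moonfire today):

const pc = new RTCPeerConnection();                    // no iceServers: LAN only, no STUN
pc.addTransceiver("video", { direction: "recvonly" }); // we only receive the camera stream
const offer = await pc.createOffer();
await pc.setLocalDescription(offer);
// POST the offer SDP to the server and apply its answer.
const resp = await fetch("/api/webrtc", {
  method: "POST",
  headers: { "Content-Type": "application/sdp" },
  body: pc.localDescription.sdp,
});
await pc.setRemoteDescription({ type: "answer", sdp: await resp.text() });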

@scottlamb (Owner) commented Apr 12, 2021

One thing to note about going for a WebRTC approach is that we might need a different implementation depending on whether the client is on the same network or not.

Would we still need STUN if the server is Internet-accessible (not behind NAT or using port forwarding)? It'd be nice to avoid that complexity. I don't really know anything about WebRTC yet or have any experience with it.

I also took a quick look into the HLS approach. It doesn't look too hard if we can use fragmented .mp4 files. We already have (extensive) logic for generating those. There's also a proof-of-concept Rust crate called lowly which is written by the same author as the h264-reader crate we're using. I don't think we'd use lowly directly for several reasons (e.g., it uses ffmpeg to generate its .mp4 files where we have our own logic), but it might be a place to look to for inspiration.

I'm still grieving the inability to use MSE on all platforms. That was how I'd planned to do the scrub bar stuff. WebCodecs would be even better but I know that won't be viable for a long time. I wonder if Apple will add support for MSE on iPhone soon, given that they apparently support it on iPad. It seems so strange they'd support it in one place but not the other.

@scottlamb changed the title from "live view not working on iOS" to "live view not working on iPhone" on Aug 13, 2021
@scottlamb (Owner) commented:

Updating the title to reflect that I think this only happens on iPhone; iPads apparently do support MSE like desktop Safari.

The version I'm about to release will at least give a more helpful error message, directing to this issue.

I'm thinking now the best course of action might be to have the Javascript UI create an adapter between the current HTTP API and the HLS approach that iPhones apparently need. Then we could keep pushing live segments immediately over the WebSocket rather than switching to polling.

@hn commented Apr 7, 2024

I'm not much into these frontend things, but the new Managed Media Source might be an option to resolve this issue (reportedly easy to implement, and available on iPhones running iOS 17+). More info e.g. here or here.

@scottlamb (Owner) commented Apr 17, 2024

Thanks for the pointer! Some progress:

  1. It turns out there's an iPhone simulator available on macOS as part of Xcode, so I'm able to actually test this even though I don't have access to a physical iPhone.
  2. That webkit.org link had a key paragraph that I missed. I figured it out by trial and error with code samples instead, and just noticed it now, after the fact.

    Note that support for Managed Media Source is only available when an AirPlay source alternative is present, or remote playback is explicitly disabled.

  3. I'm able to switch from MediaSource to ManagedMediaSource on Safari/macOS (basically a no-op, as MediaSource was working fine), but it's still not working on iPhone. The problem is window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"') returns false, and even if I disable that check, I don't see the video actually show up. I'm not sure what the problem is. I wondered if it was the container format. I could add support to Moonfire for MPEG-TS, but window.ManagedMediaSource.isTypeSupported('video/mp2ts; codecs="avc1.4D401E"') also returns false, so there's probably no point.
[screenshot attached]
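
For reference, the kind of feature detection and attachment described above looks roughly like this (a sketch loosely following the webkit.org example; `video` stands in for the live-view element, and none of this is the exact Moonfire code):

const MS = window.ManagedMediaSource || window.MediaSource;
if (MS === undefined || !MS.isTypeSupported('video/mp4; codecs="avc1.4D401E"')) {
  throw new Error("no usable MediaSource implementation");
}
const source = new MS();
video.disableRemotePlayback = true;  // ManagedMediaSource needs this unless an AirPlay alternative is present
video.srcObject = source;
source.addEventListener("sourceopen", () => {
  const buf = source.addSourceBuffer('video/mp4; codecs="avc1.4D401E"');
  // append the fMP4 init segment and media segments to buf as they arrive
}, { once: true });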

scottlamb added a commit that referenced this issue Apr 17, 2024
@scottlamb (Owner) commented:

On second thought, this might be a simulator problem. I wonder if window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"') would return true on a physical iPhone.

@hn commented Apr 17, 2024

The most complicated Javascript code I've ever written (because I've never written any JS before):

foo = window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="avc1.4D401E"')
bar = window.ManagedMediaSource.isTypeSupported('video/mp4; codecs="nonsense"')

console.log(foo)
console.log(bar)

Returns foo=true and bar=false on a physical iPhone running iOS 17.4 (and, as expected, a TypeError on desktop MS Edge).

@scottlamb (Owner) commented:

That's encouraging! My change may be enough then.

@hn commented Apr 17, 2024

After downloading a zillion fancy build somethings I can say ... hooray, it (the iphone branch) works :)

I see all my cameras e.g. in 2x2 view, which is great. But ... they do not refresh (still images). If I select (change) a camera in one of the dropdowns the video comes to foreground and refreshes constantly (live stream). It switches to still image if I stop the foreground mode. Disclaimer: all based on a quick 5 minute test.

@scottlamb (Owner) commented:

Progress! Is there anything useful written in the Javascript console?

@hn commented Apr 18, 2024

As far as I can see, there is no easily accessible javascript console on the iPhone. You have to connect the phone to a computer via cable and debug there. I can't say at the moment when I will have time to do this.

Pure guesswork: Possibly the behavior with the still images is not a bug, but the intention of the "Managed" Media Source so that no battery power is wasted. Perhaps it is possible to set some kind of priority flag, which will cause the videos to play live?

In any case, the current state of the code is already a good usable step forward.

@scottlamb (Owner) commented:

As far as I can see, there is no easily accessible javascript console on the iPhone. You have to connect the phone to a computer via cable and debug there. I can't say at the moment when I will have time to do this.

I think you're right; it's a similar experience for Chrome/Android.

Pure guesswork: Possibly the behavior with the still images is not a bug, but the intention of the "Managed" Media Source so that no battery power is wasted. Perhaps it is possible to set some kind of priority flag, which will cause the videos to play live?

That doesn't match my understanding of how it's supposed to work. The startstreaming and endstreaming events are hints; you can totally ignore them.
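
Concretely, a sketch of what honoring those hints might look like (`source` is the ManagedMediaSource instance; fetchMoreMedia and pauseFetching are placeholders for whatever segment-fetching logic the UI uses):

source.addEventListener("startstreaming", () => fetchMoreMedia());  // the browser wants more data appended
source.addEventListener("endstreaming", () => pauseFetching());     // the browser has enough buffered for now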

This part of the Moonfire UI still says (experimental) because I know there are bugs/omissions in my handling in general. I've seen error messages in Firefox still; if you put your device to sleep then wake it up (or maybe even just tab away and back), you can end up with several seconds of stale video in the buffer; etc. So I wouldn't be surprised if I have more work to do. I'm just happy it seems to be fundamentally possible to make this API work on iPhone where it didn't before. Not having one API that would work on all devices really took the wind out of my sails in terms of developing a nice UI for Moonfire.

@IronOxidizer (Contributor, Author) commented:

Great to see progress being made on this!

As far as I can see, there is no easily accessible javascript console on the iPhone.

You can enable log collection using chrome://inspect as mentioned in
https://blog.chromium.org/2019/03/debugging-websites-in-chrome-for-ios.html

Enable JavaScript log collection by navigating to chrome://inspect in Chrome for iOS and leaving that tab open to collect logs. In another tab, reproduce the case for which you are interested. Then switch back to the chrome://inspect tab to view the collected logs. (Log collection will stop if the chrome://inspect page closes or navigates and logs will be lost as they are not persisted.)

@scottlamb (Owner) commented:

I bought a used iPhone SE 3rd gen (2022) for developing Moonfire NVR and other projects.

I see all my cameras e.g. in 2x2 view, which is great. But ... they do not refresh (still images). If I select (change) a camera in one of the dropdowns the video comes to foreground and refreshes constantly (live stream). It switches to still image if I stop the foreground mode. Disclaimer: all based on a quick 5 minute test.

I can see exactly this behavior for myself now. And I think the solution is really simple: add a playsinline attribute, e.g. <video playsinline>.
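
For illustration, the kind of one-line change I mean (a sketch, with `video` standing in for the live view's element):

video.setAttribute("playsinline", "");  // equivalent to <video playsinline>; keeps playback inline on iPhone
// (the property form, video.playsInline = true, does the same thing)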

The same full-screen behavior / auto-pause when you leave it seems to be present for the list view stuff. But it's less problematic there because that uses <video controls> and so you can use those controls to unpause it. The live view doesn't offer a way to do that, so without playsinline once you exit out of the full-screen thing, it just stalls until you switch to another camera.

I think that will bring iPhone up to parity with other devices (although hardly perfection yet).

@hn commented Aug 22, 2024

I am happy to confirm that the live-view here now also works as intended!
