Has the author considered supporting Live Photo display? #2297
Comments
If I remember correctly, a Live Photo downloaded from the network is split into a static image (HEIC) and a video (MOV), so downloading and caching may need to be handled separately. Could you share some real-world scenarios of how you use Live Photos (for example, where your Live Photos come from and how the server delivers these resources)? That could help in designing a more sensible API. Thanks.
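As the comment notes, a Live Photo travels as two files. A minimal sketch of fetching and caching the pair separately follows; the URLs, file names, and cache layout are purely illustrative assumptions, not Kingfisher API:

```swift
import Foundation

// Hypothetical source of one Live Photo: a still image (HEIC) plus a paired video (MOV).
struct LivePhotoSource {
    let imageURL: URL  // e.g. the HEIC resource
    let videoURL: URL  // e.g. the MOV resource

    // Download both halves concurrently and cache them side by side.
    // PhotoKit later requires the cached files to keep their extensions.
    func fetchAndCache(into directory: URL) async throws -> (image: URL, video: URL) {
        async let imageResult = URLSession.shared.data(from: imageURL)
        async let videoResult = URLSession.shared.data(from: videoURL)
        let (imageData, _) = try await imageResult
        let (videoData, _) = try await videoResult

        let imageFile = directory.appendingPathComponent("live").appendingPathExtension("heic")
        let videoFile = directory.appendingPathComponent("live").appendingPathExtension("mov")
        try imageData.write(to: imageFile, options: .atomic)
        try videoData.write(to: videoFile, options: .atomic)
        return (imageFile, videoFile)
    }
}
```

The key point is only that the two resources are independent downloads with independent cache entries, which is why a Live Photo does not fit the single-URL, single-cache-entry model directly.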
Thank you very much for replying despite your busy schedule. Thanks again for your reply; looking forward to the next Kingfisher release.
Deciding which view to use internally based on the URL, or even the response data, is somewhat beyond Kingfisher's original design goals. Perhaps we could consider … What do you think?
Of course, if you really do need to display Live Photos mixed into cells, you could also consider having the server include metadata in its response marking which images are Live Photos, so that when creating the cell you can directly choose to use …
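The suggestion above, letting the server flag Live Photos up front, could look roughly like this. The payload shape and field names are invented for illustration:

```swift
import Foundation

// Assumed server payload: each media item carries a flag telling the client
// whether it is a Live Photo before any download happens.
struct MediaItem: Decodable {
    let imageURL: URL
    let videoURL: URL?   // present only for Live Photos
    let isLivePhoto: Bool
}

// With the flag known when the cell is dequeued, the data source can pick the
// right cell class (a plain image view vs. a PHLivePhotoView-backed cell)
// directly, instead of inspecting the URL or response data after download.
func reuseIdentifier(for item: MediaItem) -> String {
    item.isLivePhoto ? "LivePhotoCell" : "ImageCell"
}
```

This keeps the view decision out of the image-loading layer entirely, which is the design-consistency point being made in this thread.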
I think this solution is very good and fits well with the consistency of Kingfisher's current design. Really looking forward to this release, thank you!
That's not a big problem; as long as I agree on a protocol with the server side, the two cases can be distinguished.
@onevcat Hello!
There's still some work to do: the last bit of refactoring, supplementing the documentation, and so on. If you're in a hurry you can try the master branch first; feedback and suggestions are welcome.
So it's working now?
@onevcat Hi.
The development environment is iOS 18.0.1 with Xcode 16. The cause we found is that the downloaded image and video are missing metadata; after adding some extra local processing on our side, playback works normally. Our processing logic follows the addIdentifier implementation described in https://juejin.cn/post/7222229682027610149?searchId=20241030174923E793C0448C99F5168EF8#heading-5.
@HIIgor It looks like this image and its paired video were not exported strictly in Apple's way. I tried a very crude implementation (you can find it in this branch. Don't use it; it crashes when loading from cache), but found that if everything is done this way, the performance regression is fairly serious: on an iPhone 16 the video takes about 1.6s and the image about 0.3s to process. So it probably can't be applied blindly to every image and video. I'll look into whether there's a way to optimize and verify this. If it can be done without hurting performance, I'll consider adding it as a built-in; if there's no particularly good way, I'll just provide a delegate so users can decide and add it themselves. Finally, I'd still recommend doing this metadata processing in advance (either on the server or in the uploading client). On the download/display side, we'd prefer to receive content that can be displayed directly. Please take that into consideration.
For this, my approach is to do the conversion at download time. I also considered handling it at upload time initially, but for various reasons a batch of Live Photos had already gone live, and supporting both would mean writing the logic twice, which felt like a maintenance headache later, so I just handled it in the Downloader:
I originally wanted to handle it in a Processor, but found that in the Live Photo case the processor is hard-coded, so I had to move one step earlier.
@zkhCreator Thanks for sharing; I understand the situation now. May I ask how this performs? For example, for the video's … Also, you could consider implementing this delegate: https://github.com/onevcat/Kingfisher/blob/master/Sources/Networking/ImageDownloaderDelegate.swift#L75-L96 Compared with subclassing a downloader, it might look a bit cleaner.
import AVFoundation
import Photos
import PhotosUI
import UIKit
import UniformTypeIdentifiers

// These framework types are not Sendable; mark them as such so they can cross
// actor boundaries. This is safe only while each instance stays confined to one task.
extension PHAssetResource: @unchecked @retroactive Sendable {}
extension AVAssetTrack: @unchecked @retroactive Sendable {}
extension PHLivePhoto: @unchecked @retroactive Sendable {}
extension AVAssetReader: @unchecked @retroactive Sendable {}
extension AVAssetWriterInput: @unchecked @retroactive Sendable {}
extension AVAssetReaderTrackOutput: @unchecked @retroactive Sendable {}
extension AVAssetWriter: @unchecked @retroactive Sendable {}
enum LivePhotosDisassembleError: Error {
    case requestDataFailed
    case noFilenameExtension
    case noImageData
}

enum LivePhotosAssembleError: Error {
    case addPhotoIdentifierFailed
    case createDestinationImageFailed
    case writingVideoFailed
    case writingAudioFailed
    case requestFailed
    case loadTracksFailed
    case loadMovieResultFailed
    case noCachesDirectory
}
public actor LivePhotosUtils {
    public static let sharedInstance = LivePhotosUtils()

    public func isLivePhoto(item: PHPickerResult) async -> Bool {
        return await item.itemProvider.isLive()
    }
}

public struct LivePhotoParsedModel: Sendable {
    public let photoUrl: URL
    public let movieUrl: URL
    public let image: UIImage

    public init(photoUrl: URL, movieUrl: URL, image: UIImage) {
        self.photoUrl = photoUrl
        self.movieUrl = movieUrl
        self.image = image
    }
}
// MARK: - Disassemble
public extension LivePhotosUtils {
    // Disassemble a Live Photo into its still image and paired video.
    func disassemble(livePhoto: PHLivePhoto) async throws -> LivePhotoParsedModel {
        let assetResources = PHAssetResource.assetResources(for: livePhoto)
        let list = try await withThrowingTaskGroup(of: (PHAssetResource, Data).self) { taskGroup in
            for assetResource in assetResources {
                taskGroup.addTask {
                    return try await withCheckedThrowingContinuation { continuation in
                        let dataBuffer = NSMutableData()
                        let options = PHAssetResourceRequestOptions()
                        options.isNetworkAccessAllowed = true
                        PHAssetResourceManager.default().requestData(for: assetResource, options: options) { data in
                            dataBuffer.append(data)
                        } completionHandler: { error in
                            guard error == nil else {
                                continuation.resume(throwing: LivePhotosDisassembleError.requestDataFailed)
                                return
                            }
                            continuation.resume(returning: (assetResource, dataBuffer as Data))
                        }
                    }
                }
            }
            var results: [(PHAssetResource, Data)] = []
            for try await result in taskGroup {
                results.append(result)
            }
            return results
        }
        guard var photo = (list.first { $0.0.type == .photo }),
              let video = (list.first { $0.0.type == .pairedVideo }) else {
            throw LivePhotosDisassembleError.requestDataFailed
        }
        let (imageData, image) = try compressImage(imageData: photo.1)
        photo.1 = imageData
        let cachesDirectory = cachesDirectory()
        let photoURL = try save(photo.0, data: photo.1, to: cachesDirectory, fileExtension: "jpeg")
        let videoURL = try save(video.0, data: video.1, to: cachesDirectory)
        return LivePhotoParsedModel(photoUrl: photoURL, movieUrl: videoURL, image: image)
    }

    private func save(_ assetResource: PHAssetResource, data: Data, to url: URL, fileExtension: String? = nil) throws -> URL {
        guard let ext = fileExtension ?? UTType(assetResource.uniformTypeIdentifier)?.preferredFilenameExtension else {
            throw LivePhotosDisassembleError.noFilenameExtension
        }
        let destinationURL = url.appendingPathComponent(UUID().uuidString).appendingPathExtension(ext)
        // Create the directory if it doesn't exist.
        try FileManager.default.createDirectory(at: destinationURL.deletingLastPathComponent(), withIntermediateDirectories: true, attributes: nil)
        try data.write(to: destinationURL, options: [.atomic])
        return destinationURL
    }

    private func compressImage(imageData: Data) throws -> (Data, UIImage) {
        guard let image = UIImage(data: imageData), let imageData = image.jpegData(compressionQuality: 0.9) else {
            throw LivePhotosDisassembleError.noImageData
        }
        return (imageData, image)
    }
}
// MARK: - Assemble
public extension LivePhotosUtils {
    // Assemble a photo file and a video file into a Live Photo.
    func assemble(photoURL: URL, videoURL: URL, progress: ((Float) -> Void)? = nil) async throws -> (PHLivePhoto, (URL, URL)) {
        let cacheDirectory = try assembleCachesDirectory()
        let identifier = UUID().uuidString
        let pairedPhotoURL = try addIdentifier(
            identifier,
            fromPhotoURL: photoURL,
            to: cacheDirectory.appendingPathComponent(identifier).appendingPathExtension("jpg"))
        let pairedVideoURL = try await addIdentifier(
            identifier,
            fromVideoURL: videoURL,
            to: cacheDirectory.appendingPathComponent(identifier).appendingPathExtension("mov"),
            progress: progress)
        let livePhoto = try await combinePhotoLive(imageURL: pairedPhotoURL, movieURL: pairedVideoURL)
        return (livePhoto, (pairedPhotoURL, pairedVideoURL))
    }

    func combinePhotoLive(imageURL: URL, movieURL: URL) async throws -> PHLivePhoto {
        let livePhoto = try await withCheckedThrowingContinuation { continuation in
            PHLivePhoto.request(
                withResourceFileURLs: [imageURL, movieURL],
                placeholderImage: nil,
                targetSize: .zero,
                contentMode: .aspectFill) { livePhoto, info in
                    // The callback may fire first with a degraded (low-quality) result; skip it.
                    if let isDegraded = info[PHLivePhotoInfoIsDegradedKey] as? Bool, isDegraded {
                        return
                    }
                    if let livePhoto {
                        continuation.resume(returning: livePhoto)
                    } else {
                        continuation.resume(throwing: LivePhotosAssembleError.requestFailed)
                    }
                }
        }
        return livePhoto
    }
}
// MARK: - Assemble Photo
extension LivePhotosUtils {
    private func addIdentifier(_ identifier: String, fromPhotoURL photoURL: URL, to destinationURL: URL) throws -> URL {
        let imageSource = CGImageSourceCreateWithURL(photoURL as CFURL, nil)
        let destinationType = UTType.jpeg.identifier as CFString
        let destination = CGImageDestinationCreateWithURL(destinationURL as CFURL, destinationType, 1, nil)
        try addIdentifierToImage(identifier: identifier, imageSource: imageSource, destination: destination)
        return destinationURL
    }

    public func addIdentifier(_ identifier: String, toPhotoData photoData: Data) throws -> Data {
        let imageSource = CGImageSourceCreateWithData(photoData as CFData, nil)
        let destinationType = UTType.jpeg.identifier as CFString
        let mutableData = NSMutableData()
        let destination = CGImageDestinationCreateWithData(mutableData, destinationType, 1, nil)
        try addIdentifierToImage(identifier: identifier, imageSource: imageSource, destination: destination)
        return mutableData as Data
    }

    private func addIdentifierToImage(identifier: String, imageSource: CGImageSource?, destination: CGImageDestination?) throws {
        guard let imageSource = imageSource,
              let imageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, nil),
              var imageProperties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, nil) as? [AnyHashable: Any],
              let destination = destination else {
            throw LivePhotosAssembleError.addPhotoIdentifierFailed
        }
        // Key "17" in the MakerApple dictionary holds the content identifier
        // that pairs the image with its video.
        let identifierInfo = ["17": identifier]
        imageProperties[kCGImagePropertyMakerAppleDictionary] = identifierInfo
        CGImageDestinationAddImage(destination, imageRef, imageProperties as CFDictionary)
        if !CGImageDestinationFinalize(destination) {
            throw LivePhotosAssembleError.createDestinationImageFailed
        }
    }
}
// MARK: - Assemble Movie
extension LivePhotosUtils {
    public func addIdentifier(
        _ identifier: String,
        toMovieData toData: Data,
        progress: ((Float) -> Void)? = nil
    ) async throws -> Data {
        // Write the data to a temporary file so it can be read as an AVAsset.
        guard let tempUrl = TemplateFileStorage.save(data: toData, fileExtension: "mov") else {
            throw LivePhotosAssembleError.writingVideoFailed
        }
        let toDataDestination = TemplateFileStorage.filePath(directoryPath: [], fileName: UUID().uuidString, pathExtension: "mov")
        let result = try await addIdentifier(identifier, fromVideoURL: tempUrl, to: toDataDestination, progress: progress)
        TemplateFileStorage.deleteFile(at: tempUrl)
        guard let data = TemplateFileStorage.readFile(at: result) else {
            TemplateFileStorage.deleteFile(at: result)
            throw LivePhotosAssembleError.loadMovieResultFailed
        }
        TemplateFileStorage.deleteFile(at: result)
        return data
    }

    private func addIdentifier(
        _ identifier: String,
        fromVideoURL videoURL: URL,
        to destinationURL: URL,
        progress: ((Float) -> Void)? = nil
    ) async throws -> URL {
        let asset = AVURLAsset(url: videoURL)

        // --- Reader ---
        // Create the video reader
        let videoReader = try AVAssetReader(asset: asset)
        // Create the video reader output
        guard let videoTrack = try await asset.loadTracks(withMediaType: .video).first else {
            throw LivePhotosAssembleError.loadTracksFailed
        }
        let videoReaderOutputSettings: [String: Any] = [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
        let videoReaderOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: videoReaderOutputSettings)
        // Add the video reader output to the video reader
        videoReader.add(videoReaderOutput)

        // Create the audio reader
        let audioReader = try AVAssetReader(asset: asset)
        // Create the audio reader output
        guard let audioTrack = try await asset.loadTracks(withMediaType: .audio).first else {
            throw LivePhotosAssembleError.loadTracksFailed
        }
        let audioReaderOutput = AVAssetReaderTrackOutput(track: audioTrack, outputSettings: nil)
        // Add the audio reader output to the audio reader
        audioReader.add(audioReaderOutput)

        // --- Writer ---
        // Create the asset writer
        let assetWriter = try AVAssetWriter(outputURL: destinationURL, fileType: .mov)
        // Create the video writer input
        let naturalSize = try await videoTrack.load(.naturalSize)
        let videoWriterInputOutputSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: naturalSize.width,
            AVVideoHeightKey: naturalSize.height]
        let videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoWriterInputOutputSettings)
        videoWriterInput.transform = try await videoTrack.load(.preferredTransform)
        videoWriterInput.expectsMediaDataInRealTime = true
        // Add the video writer input to the asset writer
        assetWriter.add(videoWriterInput)
        // Create the audio writer input
        let audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: nil)
        audioWriterInput.expectsMediaDataInRealTime = false
        // Add the audio writer input to the asset writer
        assetWriter.add(audioWriterInput)

        // Create the identifier metadata
        let identifierMetadata = metadataItem(for: identifier)
        // Create the still-image-time metadata track
        let stillImageTimeMetadataAdaptor = stillImageTimeMetadataAdaptor()
        assetWriter.metadata = [identifierMetadata]
        assetWriter.add(stillImageTimeMetadataAdaptor.assetWriterInput)

        // Start the asset writer
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: .zero)

        // Add still-image-time metadata at the midpoint of the movie
        let frameCount = try await asset.frameCount()
        let stillImagePercent: Float = 0.5
        await stillImageTimeMetadataAdaptor.append(
            AVTimedMetadataGroup(
                items: [stillImageTimeMetadataItem()],
                timeRange: try asset.makeStillImageTimeRange(percent: stillImagePercent, inFrameCount: frameCount)))

        async let writingVideoFinished: Bool = withCheckedThrowingContinuation { continuation in
            Task {
                videoReader.startReading()
                var currentFrameCount = 0
                videoWriterInput.requestMediaDataWhenReady(on: DispatchQueue(label: "videoWriterInputQueue")) {
                    while videoWriterInput.isReadyForMoreMediaData {
                        if let sampleBuffer = videoReaderOutput.copyNextSampleBuffer() {
                            currentFrameCount += 1
                            if let progress {
                                let progressValue = min(Float(currentFrameCount) / Float(frameCount), 1.0)
                                Task { @MainActor in
                                    progress(progressValue)
                                }
                            }
                            if !videoWriterInput.append(sampleBuffer) {
                                videoReader.cancelReading()
                                continuation.resume(throwing: LivePhotosAssembleError.writingVideoFailed)
                                return
                            }
                        } else {
                            videoWriterInput.markAsFinished()
                            continuation.resume(returning: true)
                            return
                        }
                    }
                }
            }
        }

        async let writingAudioFinished: Bool = withCheckedThrowingContinuation { continuation in
            Task {
                audioReader.startReading()
                audioWriterInput.requestMediaDataWhenReady(on: DispatchQueue(label: "audioWriterInputQueue")) {
                    while audioWriterInput.isReadyForMoreMediaData {
                        if let sampleBuffer = audioReaderOutput.copyNextSampleBuffer() {
                            if !audioWriterInput.append(sampleBuffer) {
                                audioReader.cancelReading()
                                continuation.resume(throwing: LivePhotosAssembleError.writingAudioFailed)
                                return
                            }
                        } else {
                            audioWriterInput.markAsFinished()
                            continuation.resume(returning: true)
                            return
                        }
                    }
                }
            }
        }

        // Wait for both tracks to finish before finalizing the file.
        _ = try await (writingVideoFinished, writingAudioFinished)
        await assetWriter.finishWriting()
        return destinationURL
    }
    private func metadataItem(for identifier: String) -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.keySpace = AVMetadataKeySpace.quickTimeMetadata // "mdta"
        item.dataType = "com.apple.metadata.datatype.UTF-8"
        item.key = AVMetadataKey.quickTimeMetadataKeyContentIdentifier as any NSCopying & NSObjectProtocol // "com.apple.quicktime.content.identifier"
        item.value = identifier as any NSCopying & NSObjectProtocol
        return item
    }

    private func stillImageTimeMetadataAdaptor() -> AVAssetWriterInputMetadataAdaptor {
        let quickTimeMetadataKeySpace = AVMetadataKeySpace.quickTimeMetadata.rawValue // "mdta"
        let stillImageTimeKey = "com.apple.quicktime.still-image-time"
        let spec: [NSString: Any] = [
            kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier as NSString: "\(quickTimeMetadataKeySpace)/\(stillImageTimeKey)",
            kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType as NSString: kCMMetadataBaseDataType_SInt8]
        var desc: CMFormatDescription? = nil
        CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
            allocator: kCFAllocatorDefault,
            metadataType: kCMMetadataFormatType_Boxed,
            metadataSpecifications: [spec] as CFArray,
            formatDescriptionOut: &desc)
        let input = AVAssetWriterInput(
            mediaType: .metadata,
            outputSettings: nil,
            sourceFormatHint: desc)
        return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
    }

    private func stillImageTimeMetadataItem() -> AVMetadataItem {
        let item = AVMutableMetadataItem()
        item.key = "com.apple.quicktime.still-image-time" as any NSCopying & NSObjectProtocol
        item.keySpace = AVMetadataKeySpace.quickTimeMetadata // "mdta"
        item.value = 0 as any NSCopying & NSObjectProtocol
        item.dataType = kCMMetadataBaseDataType_SInt8 as String // "com.apple.metadata.datatype.int8"
        return item
    }
}
extension LivePhotosUtils {
    private func cachesDirectory() -> URL {
        return TemplateFileStorage.fileDirectory(directoryPath: ["livePhotos"])
    }

    private func assembleCachesDirectory() throws -> URL {
        if let cachesDirectoryURL = try? FileManager.default.url(for: .cachesDirectory, in: .userDomainMask, appropriateFor: nil, create: false) {
            let cachesDirectory = cachesDirectoryURL.appendingPathComponent("livePhotos", isDirectory: true)
            // fileExists(atPath:) expects a file system path, not a URL string.
            if !FileManager.default.fileExists(atPath: cachesDirectory.path) {
                try? FileManager.default.createDirectory(at: cachesDirectory, withIntermediateDirectories: true, attributes: nil)
            }
            return cachesDirectory
        }
        throw LivePhotosAssembleError.noCachesDirectory
    }

    public func clearAssembleCachesDirectory() {
        do {
            let cachesDirectory = try assembleCachesDirectory()
            let fileManager = FileManager.default
            let fileURLs = try fileManager.contentsOfDirectory(at: cachesDirectory, includingPropertiesForKeys: nil, options: .skipsHiddenFiles)
            for fileURL in fileURLs {
                try fileManager.removeItem(at: fileURL)
            }
            print("Successfully cleared the assembleCachesDirectory")
        } catch {
            print("Error clearing assembleCachesDirectory: \(error)")
        }
    }
} I'm just pasting the source code directly; the … inside
I haven't checked the performance in much detail; since there was no perceptible stutter, I just left it alone.
Thanks! I'll look into it carefully. If it can be added, I may later add an option for this, so that everyone doesn't have to implement it themselves.
👌~ |
Live Photo is a bit special. First, it completely bypasses the memory cache, so you can't fetch it that way. If you need in-memory access, you can keep a reference to the PHLivePhoto from the result. Also, if you want to get the source image and video from disk, then because of PhotoKit the files stored on disk must carry an extension, so you need to specify one explicitly.
Check List
Thanks for considering opening an issue. Before you submit your issue, please confirm these boxes are checked.
Issue Description
Has the author considered supporting Live Photo display? Handling Live Photo caching and display myself is very complicated and a bit beyond me. I hope the author can look into it, thanks.
What
[Tell us about the issue]
Reproduce
[The steps to reproduce this issue. What is the url you were trying to load, where did you put your code, etc.]
Other Comment
[Add anything else here]