#StackBounty: #swift #swift3 Get playback milliseconds

Bounty: 50

I have already achieved most of the work, but I am still struggling with this part.

I want to get the playback position in milliseconds. With the code below I get the duration and the current playing time, but I also want the milliseconds, as part of my subtitle application.

Okay, so we start with the seconds. This code gives me what I want:

func getHoursMinutesSecondsFrom(seconds: Double) -> (hours: Int, minutes: Int, seconds: Int) {
    let secs = Int(seconds)
    let hours = secs / 3600
    let minutes = (secs % 3600) / 60
    let seconds = (secs % 3600) % 60

    // return the milliseconds here as well?

    return (hours, minutes, seconds)
}
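As a side note on the question in that comment, the millisecond component can be derived from the fractional part of the Double before it is truncated. A possible variant (a sketch, building on the truncatingRemainder trick mentioned further down):

func getHoursMinutesSecondsMillisecondsFrom(seconds: Double) -> (hours: Int, minutes: Int, seconds: Int, milliseconds: Int) {
    let secs = Int(seconds)
    let hours = secs / 3600
    let minutes = (secs % 3600) / 60
    let secondsPart = secs % 60
    // The fractional part of the Double holds the milliseconds
    let milliseconds = Int(seconds.truncatingRemainder(dividingBy: 1) * 1000)
    return (hours, minutes, secondsPart, milliseconds)
}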

Now, with this function, we format the time:

func formatTimeFor(seconds: Double) -> String {
    let result = getHoursMinutesSecondsFrom(seconds: seconds)
    let hoursString = "\(result.hours)"
    var minutesString = "\(result.minutes)"
    if minutesString.characters.count == 1 {
        minutesString = "0\(result.minutes)"
    }
    var secondsString = "\(result.seconds)"
    if secondsString.characters.count == 1 {
        secondsString = "0\(result.seconds)"
    }
    var time = "\(hoursString):"
    if result.hours >= 1 {
        time.append("\(minutesString):\(secondsString)")
    }
    else {
        time = "\(minutesString):\(secondsString)"
    }
    return time
}

Now calling this function gets us what we want:

func updateTime() {
    // Access the current item
    if let currentItem = player?.currentItem {
        // Get the current time in seconds
        let playhead = currentItem.currentTime().seconds
        let duration = currentItem.duration.seconds
        // Format seconds as a human-readable string
        self.statics.text = "Current time: \(formatTimeFor(seconds: playhead)) ---> Full: \(formatTimeFor(seconds: duration))"
    }
}

Thanks, your help is truly appreciated.

Update

The current way of checking my playback position is to have a timer that hits a function every 0.5 seconds, like this:

func scheduledTimerWithTimeInterval() {
    // Schedule a timer to call the updateCounting function every 0.5 seconds
    timer = Timer.scheduledTimer(timeInterval: 0.5, target: self, selector: #selector(self.updateCounting), userInfo: nil, repeats: true)
}

Side note: I do get the milliseconds, in a way:

let millisecondsInt = (seconds.truncatingRemainder(dividingBy: 1) * 1000)

The problem is that the updateTime() function is only hit every 0.5 seconds, which gives me unreliable info. For example, if I wanted to show a subtitle at 00:16:267, there is maybe a 10% chance the code will run at that moment, because the reported millisecond values are effectively random.

I assume this approach is somehow not correct. I'd like the counter to do the job this way:

Current time: 00:14:266 ---> Full: 04:12:610
Current time: 00:14:267 ---> Full: 04:12:610 // 267 milliseconds +1
Current time: 00:14:268 ---> Full: 04:12:610
Current time: 00:14:269 ---> Full: 04:12:610
Current time: 00:14:270 ---> Full: 04:12:610
Current time: 00:14:271 ---> Full: 04:12:610
Current time: 00:14:272 ---> Full: 04:12:610

Not this way, where the numbers jump randomly:

Current time: 00:13:767 ---> Full: 04:12:610
Current time: 00:14:267 ---> Full: 04:12:610
Current time: 00:14:767 ---> Full: 04:12:610
Current time: 00:15:266 ---> Full: 04:12:610
Current time: 00:15:767 ---> Full: 04:12:610
Current time: 00:16:267 ---> Full: 04:12:610
Current time: 00:16:767 ---> Full: 04:12:610
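For reference, a smoother alternative to the Timer is AVPlayer's own periodic time observer, which fires in step with playback, so the reported times advance steadily instead of landing on arbitrary Timer ticks. A minimal sketch (assuming the same player property plus a stored timeObserverToken property, and noting that callbacks will not literally arrive every single millisecond):

let interval = CMTime(value: 1, timescale: 1000) // request ~1 ms granularity
timeObserverToken = player?.addPeriodicTimeObserver(forInterval: interval, queue: .main) { [weak self] time in
    // time.seconds keeps its fractional part, so the milliseconds stay available
    self?.updateTime()
}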



#StackBounty: #ios #objective-c #swift #xcode #bridging-header Xcode 10 Failed to emit precompiled header for bridging header

Bounty: 50

Hi, I'm getting a very annoying error on Xcode 10:

1 error generated.
<unknown>:0: error: failed to emit precompiled header '/Users/me/Library/Developer/Xcode/DerivedData/APP-hlczpckeselwrtaqjcbxdpoiogkj/Build/Intermediates.noindex/PrecompiledHeaders/APP-Bridging-Header-swift_35K3KO8G70VCD-clang_3DGF15CYP79L0.pch' for bridging header '/Users/me/Desktop/Swift/Folder/APP/APP/UNLKV2-Bridging-Header.h'

I'm not sure how to solve this; I have tried everything. It was very sudden: there was no problem before, but today, the second I added the Firebase Info.plist to the project, I got this error. I have tried deleting the file and made sure the name of the header file was entered correctly in the "Objective-C Bridging Header" build setting, where I have it entered as
$(PROJECT_DIR)/$(PROJECT_NAME)/$(PROJECT_NAME)-Bridging-Header.h
I have cleaned and built the project multiple times and reinstalled all the pods.

Besides this, I also get an error that says:

'JPSVolumeButtonHandler/JPSVolumeButtonHandler.h' file not found

I would really appreciate it if someone could help me solve this; I've been looking around all day at every single post and forum, and nothing has worked for me. If you need any more information about this issue, please let me know.



#StackBounty: #swift #calayer #cabasicanimation #avassetexportsession #avvideocomposition Can't show animated CALayer in video usin…

Bounty: 50

UPDATE 6:
I've managed to fix my issue completely, but I would still like a better explanation of why it didn't work, in case my guess about the reason is incorrect.

I've been trying to animate a sprite sheet over a video, but every time I export the video, the end result is the sample video I started with.

Here’s my code:

First up, my custom CALayer to handle my own sprite sheets:

class SpriteLayer: CALayer {
    var frameIndex: Int

    override init() {
        // Using 0 as a default state
        self.frameIndex = 0
        super.init()
    }

    required init?(coder aDecoder: NSCoder) {
        self.frameIndex = 0
        super.init(coder: aDecoder)
    }

    override func display() {
        let currentFrameIndex = self.frameIndex
        if currentFrameIndex == 0 {
            return
        }
        let frameSize = self.contentsRect.size
        self.contentsRect = CGRect(x: 0, y: CGFloat(currentFrameIndex - 1) * frameSize.height, width: frameSize.width, height: frameSize.height)
    }

    override func action(forKey event: String) -> CAAction? {
        if event == "contentsRect" {
            return nil
        }
        return super.action(forKey: event)
    }

    override class func needsDisplay(forKey key: String) -> Bool {
        return key == "frameIndex"
    }
}

Gif is a basic class with nothing fancy that works just fine. gif.strip is a UIImage of a vertical sprite sheet representing the gif.

Now comes the method that should export a new video (it is part of a larger class used for exporting).

func convertAndExport(to url: URL, completion: @escaping () -> Void) {
        // Get Initial info and make sure our destination is available
        self.outputURL = url
        let stripCgImage = self.gif.strip!.cgImage!
        // This is used to time how long the export took
        let start = DispatchTime.now()
        do {
            try FileManager.default.removeItem(at: outputURL)
        } catch {
            print("Remove Error: (error.localizedDescription)")
            print(error)
        }
        // Find and load "sample.mp4" as an AVAsset
        let videoPath = Bundle.main.path(forResource: "sample", ofType: "mp4")!
        let videoUrl = URL(fileURLWithPath: videoPath)
        let videoAsset = AVAsset(url: videoUrl)
        // Start a new mutable Composition with the same base video track
        let mixComposition = AVMutableComposition()
        let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
        let clipVideoTrack = videoAsset.tracks(withMediaType: .video).first!
        do {
            try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
        } catch {
            print("Insert Error: (error.localizedDescription)")
            print(error)
            return
        }
        compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform
        // Quick access to the video size
        let videoSize = clipVideoTrack.naturalSize
        // Setup CALayer and it's animation
        let aLayer = SpriteLayer()
        aLayer.contents = stripCgImage
        aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
        aLayer.opacity = 1.0
        aLayer.masksToBounds = true
        aLayer.bounds = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
        aLayer.contentsRect = CGRect(x: 0, y: 0, width: 1, height: 1.0 / 3.0)
        let spriteAnimation = CABasicAnimation(keyPath: "frameIndex")
        spriteAnimation.fromValue = 1
        spriteAnimation.toValue = 4
        spriteAnimation.duration = 2.25
        spriteAnimation.repeatCount = .infinity
        spriteAnimation.autoreverses = false
        spriteAnimation.beginTime = AVCoreAnimationBeginTimeAtZero
        aLayer.add(spriteAnimation, forKey: nil)
        // Setup Layers for AVVideoCompositionCoreAnimationTool
        let parentLayer = CALayer()
        let videoLayer = CALayer()
        parentLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
        videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
        parentLayer.addSublayer(videoLayer)
        parentLayer.addSublayer(aLayer)
        // Create the mutable video composition
        let videoComp = AVMutableVideoComposition()
        videoComp.renderSize = videoSize
        videoComp.frameDuration = CMTimeMake(1, 30)
        videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)
        // Set the video composition to apply to the composition's video track
        let instruction = AVMutableVideoCompositionInstruction()
        instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
        let videoTrack = mixComposition.tracks(withMediaType: .video).first!
        let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
        instruction.layerInstructions = [layerInstruction]
        videoComp.instructions = [instruction]
        // Initialize export session
        let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)!
        assetExport.videoComposition = videoComp
        assetExport.outputFileType = AVFileType.mp4
        assetExport.outputURL = self.outputURL
        assetExport.shouldOptimizeForNetworkUse = true
        // Export
        assetExport.exportAsynchronously {
            let status = assetExport.status
            switch status {
            case .failed:
                print("Export Failed")
                print("Export Error: (assetExport.error!.localizedDescription)")
                print(assetExport.error!)
            case .unknown:
                print("Export Unknown")
            case .exporting:
                print("Export Exporting")
            case .waiting:
                print("Export Waiting")
            case .cancelled:
                print("Export Cancelled")
            case .completed:
                let end = DispatchTime.now()
                let nanoTime = end.uptimeNanoseconds - start.uptimeNanoseconds
                let timeInterval = Double(nanoTime) / 1_000_000_000
                // Function is now over, we can print how long it took
                print("Time to generate video: (timeInterval) seconds")
                completion()
            }
        }
}

EDIT:
I based my code on the following links

UPDATE 1:
I've tried removing the CABasicAnimation part of my code and played around with my CALayer, but to no avail; I can't even get the image to show up.
To test things out, I tried animating this sprite sheet with a CAKeyframeAnimation on contentsRect in an Xcode playground, and it worked fine, so I don't think the issue is with the CABasicAnimation, and maybe not even with the CALayer itself. I could really use some help on this, because I don't understand why I can't even get an image to show over my sample video in the export.

UPDATE 2:
In response to matt's comment, I tried forgetting about the sprite sheet for a bit and changed it into a CATextLayer, but I'm still not seeing anything on my video (it has dark images, so white text should be perfectly visible):

let aLayer = CATextLayer()
aLayer.string = "This is a test"
aLayer.fontSize = videoSize.height / 6
aLayer.alignmentMode = kCAAlignmentCenter
aLayer.foregroundColor = UIColor.white.cgColor
aLayer.bounds = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height / 6)

UPDATE 3:
As per Matt's request, I tried changing parentLayer.addSublayer(aLayer) to videoLayer.addSublayer(aLayer), but nothing changed. I expected as much, though, because the documentation for AVVideoCompositionCoreAnimationTool is as follows:

convenience init(postProcessingAsVideoLayer videoLayer: CALayer, 
              in animationLayer: CALayer)

meaning my parentLayer is its animationLayer, and any animations should probably be done in that layer.

UPDATE 4:
I'm starting to go crazy over here. For now I've given up on the idea of showing text or an animated image; I just want to affect my video in any way possible, so I changed aLayer to this:

let aLayer = CALayer()
aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
aLayer.backgroundColor = UIColor.white.cgColor

Well, this does absolutely nothing; I still get my unchanged sample video at my outputUrl. (I started testing this in a playground with the following code, if you want to "play" along.)

import PlaygroundSupport
import UIKit
import Foundation
import AVFoundation

func convertAndExport(to url: URL, completion: @escaping () -> Void) {
    let start = DispatchTime.now()
    do {
        try FileManager.default.removeItem(at: url)
    } catch {
        print("Remove Error: (error.localizedDescription)")
        print(error)
    }

    let videoPath = Bundle.main.path(forResource: "sample", ofType: "mp4")!
    let videoUrl = URL(fileURLWithPath: videoPath)
    let videoAsset = AVURLAsset(url: videoUrl)
    let mixComposition = AVMutableComposition()
    let compositionVideoTrack = mixComposition.addMutableTrack(withMediaType: .video, preferredTrackID: kCMPersistentTrackID_Invalid)!
    let clipVideoTrack = videoAsset.tracks(withMediaType: .video).first!

    do {
        try compositionVideoTrack.insertTimeRange(CMTimeRangeMake(kCMTimeZero, videoAsset.duration), of: clipVideoTrack, at: kCMTimeZero)
    } catch {
        print("Insert Error: (error.localizedDescription)")
        print(error)
        return
    }
    compositionVideoTrack.preferredTransform = clipVideoTrack.preferredTransform
    let videoSize = clipVideoTrack.naturalSize
    print("Video Size Detected: (videoSize.width) x (videoSize.height)")

    let aLayer = CALayer()
    aLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    aLayer.backgroundColor = UIColor.white.cgColor

    let parentLayer = CALayer()
    let videoLayer = CALayer()
    parentLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    videoLayer.frame = CGRect(x: 0, y: 0, width: videoSize.width, height: videoSize.height)
    parentLayer.addSublayer(videoLayer)
    parentLayer.addSublayer(aLayer)
    aLayer.setNeedsDisplay()
    let videoComp = AVMutableVideoComposition()
    videoComp.renderSize = videoSize
    videoComp.frameDuration = CMTimeMake(1, 30)
    videoComp.animationTool = AVVideoCompositionCoreAnimationTool(postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    let instruction = AVMutableVideoCompositionInstruction()
    instruction.timeRange = CMTimeRangeMake(kCMTimeZero, mixComposition.duration)
    let videoTrack = mixComposition.tracks(withMediaType: .video).first!
    let layerInstruction = AVMutableVideoCompositionLayerInstruction(assetTrack: videoTrack)
    instruction.layerInstructions = [layerInstruction]
    videoComp.instructions = [instruction]

    let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetPassthrough)!
    assetExport.videoComposition = videoComp
    assetExport.outputFileType = AVFileType.mp4
    assetExport.outputURL = url
    assetExport.shouldOptimizeForNetworkUse = true

    assetExport.exportAsynchronously {
        let status = assetExport.status
        switch status {
        case .failed:
            print("Export Failed")
            print("Export Error: (assetExport.error!.localizedDescription)")
            print(assetExport.error!)
        case .unknown:
            print("Export Unknown")
        case .exporting:
            print("Export Exporting")
        case .waiting:
            print("Export Waiting")
        case .cancelled:
            print("Export Cancelled")
        case .completed:
            let end = DispatchTime.now()
            let nanoTime = end.uptimeNanoseconds - start.uptimeNanoseconds
            let timeInterval = Double(nanoTime) / 1_000_000_000
            print("Time to generate video: (timeInterval) seconds")
            completion()
        }
    }
}

let outputUrl = FileManager.default.temporaryDirectory.appendingPathComponent("test.mp4")
convertAndExport(to: outputUrl) {
    print(outputUrl)
}

Please someone help me understand what I’m doing wrong…

UPDATE 5:
I am running everything except the playground tests on an iPad Air 2 (so no simulator), because I use the camera to take pictures and then stitch them into a sprite sheet that I then planned to animate over a video sent by email. I started doing playground testing because every test on the iPad required me to go through the whole app cycle (countdown, photos, form, email sending/receiving).
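One detail worth flagging in both export snippets above: AVAssetExportPresetPassthrough copies the source samples without re-encoding them, so an attached videoComposition (and therefore the AVVideoCompositionCoreAnimationTool) is effectively ignored. A minimal sketch of the change, keeping everything else the same:

// Use a preset that re-encodes frames so the video composition is applied
let assetExport = AVAssetExportSession(asset: mixComposition, presetName: AVAssetExportPresetHighestQuality)!
assetExport.videoComposition = videoComp
assetExport.outputFileType = AVFileType.mp4
assetExport.outputURL = url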



#StackBounty: #ios #objective-c #swift #amazon-web-services #aws-appsync-ios Process for uploading image to s3 with appsync || iOS Apps…

Bounty: 50

I'm working on a new project that requires uploading attachments in the form of images. I'm using DynamoDB and AppSync APIs to insert and retrieve data from the database. As we are new to AppSync and all the Amazon services and the database we are using for the app, I'm a little bit confused about the authentication process. Right now we are using an API key for authentication, and I have tried these steps to upload an image to S3.

1. Configure the AWSServiceManager with a static configuration:

let staticCredit = AWSStaticCredentialsProvider(accessKey: kAppSyncAccessKey, secretKey: kAppSyncSecretKey)
let AppSyncRegion: AWSRegionType = .USEast2
let config = AWSServiceConfiguration(region: AppSyncRegion, credentialsProvider: staticCredit)
AWSServiceManager.default().defaultServiceConfiguration = config

2. Upload the picture with this method:

func updatePictureToServer(url:URL, completion:@escaping (Bool)->Void){
    let transferManager = AWSS3TransferManager.default()
    let uploadingFileURL = url
    let uploadRequest = AWSS3TransferManagerUploadRequest()
    let userBucket = String(format: "BUCKET")
    uploadRequest?.bucket = userBucket
    let fileName = String(format: "%@%@", AppSettings.getUserId(),".jpg")
    uploadRequest?.key = fileName
    uploadRequest?.body = uploadingFileURL
    transferManager.upload(uploadRequest!).continueWith(executor: AWSExecutor.mainThread(), block: { (task:AWSTask<AnyObject>) -> Any? in
        if let error = task.error as NSError? {
            if error.domain == AWSS3TransferManagerErrorDomain, let code = AWSS3TransferManagerErrorType(rawValue: error.code) {
                switch code {
                case .cancelled, .paused:
                    break
                default:
                    print("Error uploading: (String(describing: uploadRequest!.key)) Error: (error)")
                }
            } else {
                print("Error uploading: (String(describing: uploadRequest!.key)) Error: (error)")
            }
            completion(false)
            return nil
        }

        _ = task.result
        completion(true)
        print("Upload complete for: (String(describing: uploadRequest!.key))")
        return nil
    })
}

3. And finally, I'm able to see the uploaded image in the S3 bucket:

[Screenshot: the uploaded image visible in the S3 bucket]

But I'm concerned about how to save the URL of the image and how to retrieve the image, because I would have to make the bucket PUBLIC to retrieve it, and I don't think that's a good approach. Also, is it necessary to have a Cognito user pool? We aren't using a Cognito user pool in our app yet and don't have much knowledge about it either, and the documents don't help with practical situations since we are implementing this for the first time, so we need a little help.

So, two questions:

  1. What is the proper procedure for uploading and retrieving images with S3 and AppSync?
  2. Is it necessary to use a Cognito user pool for image uploading and retrieving?

Thanks

Note: Any suggestion or improvement related to AppSync, S3, or DynamoDB will be truly appreciated, and language is not a barrier; I'm just looking for directions, so Swift or Objective-C is no problem.
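In that spirit, one common pattern for retrieving objects without making the bucket public is a time-limited pre-signed URL. A minimal sketch (assuming the same AWSS3 pod and the bucket and key naming used in the upload code above):

func presignedURL(forKey key: String, completion: @escaping (URL?) -> Void) {
    let request = AWSS3GetPreSignedURLRequest()
    request.bucket = "BUCKET"                            // same bucket as the upload
    request.key = key                                    // e.g. the "<userId>.jpg" key from above
    request.httpMethod = .GET
    request.expires = Date(timeIntervalSinceNow: 3600)   // link stays valid for 1 hour
    AWSS3PreSignedURLBuilder.default().getPreSignedURL(request).continueWith { task in
        completion(task.result as URL?)
        return nil
    }
}

The generated URL can then be stored alongside the DynamoDB record's S3 key, while the bucket itself stays private.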



#StackBounty: #ios #swift #usernotifications Usernotification framework badge does not increase

Bounty: 50

I am using the UserNotifications framework in my app, and I want to set the badge to the number of notifications received. What I did was store the number of received notifications in UserDefaults and then assign that value to the badge, but the badge number would not increase. This is my code below.

To set the count of received notifications:

center.getDeliveredNotifications { (notification) in

    UserDefaults.standard.set(notification.count, forKey: Constants.NOTIFICATION_COUNT)
    print("notification.count (notification.count)")
    print(".count noti (UserDefaults.standard.integer(forKey: Constants.NOTIFICATION_COUNT))")

}

This accurately prints the number of notifications received, but when I set it as my badge it only shows 1:

content.badge = NSNumber(value: UserDefaults.standard.integer(forKey: Constants.NOTIFICATION_COUNT))

I have no idea why the value does not increase every time. Any help would be appreciated.

Alternatively, is it possible to always update the badge from anywhere in the app?
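For reference, while the app is in the foreground the badge can be set directly from anywhere; a minimal sketch that recounts the delivered notifications and applies the result (reusing the Constants.NOTIFICATION_COUNT key from above):

func refreshBadge() {
    UNUserNotificationCenter.current().getDeliveredNotifications { notifications in
        UserDefaults.standard.set(notifications.count, forKey: Constants.NOTIFICATION_COUNT)
        DispatchQueue.main.async {
            // UIApplication must be touched on the main thread
            UIApplication.shared.applicationIconBadgeNumber = notifications.count
        }
    }
}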



#StackBounty: #ios #iphone #swift #xcode #ipad Changing alternate icon for iPad

Bounty: 150

I have a problem with changing the app's icon on iPad. Everything works fine on iPhone, but on iPad I get this error:

[default] Failed to set preferredIconName to AI-Gorgosaurus for …:0> error: Error Domain=NSCocoaErrorDomain Code=4 "The file doesn't exist." UserInfo={NSUnderlyingError=0x600000248a30 {Error Domain=LSApplicationWorkspaceErrorDomain Code=-105 "iconName not found in CFBundleAlternateIcons entry" UserInfo={NSLocalizedDescription=iconName not found in CFBundleAlternateIcons entry}}}

App icon failed to due to The file doesn't exist.

I searched and found that I need to add ~ipad in CFBundleIconFiles, but I still get the same error!

Here is the code:

func changeIcon(to name: String?) {
    // Check if the app supports alternating icons
    guard UIApplication.shared.supportsAlternateIcons else {
        return
    }

    // Change the icon to a specific image with the given name
    UIApplication.shared.setAlternateIconName(name) { (error) in
        // After the app icon changed, print our error or success message
        if let error = error {
            print("App icon failed to due to \(error.localizedDescription)")
        } else {
            print("App icon changed successfully.")
        }
    }
}


I just tested this on another project and it works fine, but not in my current project. Why? Do you have any idea?



#StackBounty: #swift #opengl-es #scenekit #vuforia #metal How to combine SCNRenderer with an existing MTLCommandBuffer

Bounty: 50

I successfully integrated the Vuforia SDK Image Target Tracking feature into an iOS project by combining the OpenGL context (EAGLContext) that the SDK provides with an instance of SceneKit's SCNRenderer. That allowed me to leverage the simplicity of SceneKit's 3D API while benefiting from Vuforia's high-precision image detection. Now I'd like to do the same, replacing OpenGL with Metal.

Some background story

I was able to draw SceneKit objects on top of the live video texture drawn by Vuforia using OpenGL without major problems.

Here’s the simplified setup I used with OpenGL:

func configureRenderer(for context: EAGLContext) {
    self.renderer = SCNRenderer(context: context, options: nil)
    self.scene = SCNScene()
    renderer.scene = scene

    // other scenekit setup
}

func render() {
    // manipulate scenekit nodes

    renderer.render(atTime: CFAbsoluteTimeGetCurrent())
}

Apple deprecates OpenGL on iOS

Since Apple announced that it is deprecating OpenGL in iOS 12, I figured it would be a good idea to migrate this project to Metal.

That should be simple in theory, as Vuforia supports Metal out of the box. However, when trying to integrate it, I hit a wall.

The question

The view only ever renders the results of the SceneKit renderer or the textures encoded by Vuforia, never both at the same time; it depends on what is encoded first. What do I have to do to blend both results together?

Here’s the problematic setup in a nutshell:

func configureRenderer(for device: MTLDevice) {
    let renderer = SCNRenderer(device: device, options: nil)
    self.scene = SCNScene()
    renderer.scene = scene

    // other scenekit setup
}

 func render(viewport: CGRect, commandBuffer: MTLCommandBuffer, drawable: CAMetalDrawable) {
    // manipulate scenekit nodes

    let renderPassDescriptor = MTLRenderPassDescriptor()
    renderPassDescriptor.colorAttachments[0].texture = drawable.texture
    renderPassDescriptor.colorAttachments[0].loadAction = .load
    renderPassDescriptor.colorAttachments[0].storeAction = .store
    renderPassDescriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 0, blue: 0, alpha: 0)

    renderer!.render(withViewport: viewport, commandBuffer: commandBuffer, passDescriptor: renderPassDescriptor)
 }

I tried calling render either after encoder.endEncoding or before commandBuffer.renderCommandEncoderWithDescriptor:

metalDevice = MTLCreateSystemDefaultDevice();
metalCommandQueue = [metalDevice newCommandQueue];
id<MTLCommandBuffer> commandBuffer = [metalCommandQueue commandBuffer];

// -----> call the `render(viewport:commandBuffer:drawable)` here <-------

id<MTLRenderCommandEncoder> encoder = [commandBuffer renderCommandEncoderWithDescriptor:renderPassDescriptor];

// calls to encoder to render textures from Vuforia

[encoder endEncoding];

// -----> or here <-------

[commandBuffer presentDrawable:drawable];
[commandBuffer commit];

In either case, I only see results of SCNRenderer OR results of the encoder, but never both in the same view.

It seems to me as if the encoding pass above and the SCNRenderer.render call are overwriting each other's buffers.

What am I missing here?
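For comparison, the usual Metal pattern for layering two passes into one drawable is to end the first encoder, then run the second pass with a .load color attachment so the first pass's pixels survive. A sketch of that ordering in Swift (queue, drawable, viewport, scnRenderer, and vuforiaEncode are assumed names standing in for the real objects):

let commandBuffer = queue.makeCommandBuffer()!

// Pass 1: Vuforia draws the camera texture into the drawable
let firstPass = MTLRenderPassDescriptor()
firstPass.colorAttachments[0].texture = drawable.texture
firstPass.colorAttachments[0].loadAction = .clear    // start from a clean target
firstPass.colorAttachments[0].storeAction = .store   // keep the result in memory
let encoder = commandBuffer.makeRenderCommandEncoder(descriptor: firstPass)!
vuforiaEncode(encoder)                               // Vuforia's draw calls
encoder.endEncoding()                                // must finish before pass 2

// Pass 2: SceneKit renders on top; .load preserves pass 1's pixels
let secondPass = MTLRenderPassDescriptor()
secondPass.colorAttachments[0].texture = drawable.texture
secondPass.colorAttachments[0].loadAction = .load
secondPass.colorAttachments[0].storeAction = .store
scnRenderer.render(withViewport: viewport, commandBuffer: commandBuffer, passDescriptor: secondPass)

commandBuffer.present(drawable)
commandBuffer.commit()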



#StackBounty: #ios #swift #uitableview #uiwebview How to optimize loading large and varied HTML content in UIWebView inside multiple Ta…

Bounty: 50

I have a UITableViewCell with a UIWebView inside it.
The data I load from the server has several comments containing rich HTML content (text, images, links, etc.). As the cell contains a UIWebView, I have to wait until the entire content has loaded to get the height of the table view cell and update it.

The issue I'm running into is that as I scroll down, the cells get updated and the whole view jerks and flickers. Basically, I'm looking for a way to preload the webviews to get their heights beforehand. This jerking only happens until all the cells have loaded their content, since I'm storing the height of each cell locally.

Here’s an example of the jittery experience.

Code of cellForRowAt:

open func tableView(_ tableView: UITableView, cellForRowAt indexPath: IndexPath) -> UITableViewCell {
    let cell = tableView.dequeueReusableCell(indexPath: indexPath) as LiMessageDetailTableViewCell
    cell.delegate = self
    cell.cellModel = LiMessageDetailTableViewCellModel(data: messageObject.originalMessage, index: indexPath)
    cell.heightOfWebView = messageObject.contentHeightsOriginalMessage[indexPath.row]
    if messageObject.contentHeightsOriginalMessage[indexPath.row] != 0.0 {
        cell.updateHeightConstrain()
    }
    return cell
}

WebView delegate method in TableViewCell

// MARK: - WEBVIEW DELEGATE
func webViewDidFinishLoad(_ webView: UIWebView) {
    // This section resizes the webview according to the newly loaded content.
    var mframe = webView.frame
    mframe.size.height = 1
    webView.frame = mframe
    let fittingSize = webView.sizeThatFits(.zero)
    mframe.size = fittingSize
    webView.frame = mframe
    /**
     I found that JavaScript gives a more accurate height when images are
     included, since they take more time to load than the normal content,
     and hence sizeThatFits does not always return a proper result.
     **/
    let heightInString = webView.stringByEvaluatingJavaScript(from: "document.body.scrollHeight") ?? ""
    guard let heightInFloat = Float(heightInString) else { return }
    guard let index = cellModel?.indexPath else { return }
    constrainHeightOfWebView.constant = fittingSize.height
    guard let cellType = cellModel?.messageType else { return }
    delegate?.updateHeight(index: index, newHeight: CGFloat(heightInFloat), cellType: cellType)
}

Delegate method which updates the height:

func updateHeight(index: IndexPath, newHeight: CGFloat, cellType: LiMessageType) {
    switch cellType {
    case .originalMessage:
        if msgObj.contentHeightsOriginalMessage[index.row] != 0.0 && msgObj.contentHeightsOriginalMessage[index.row] == newHeight {
            return
        }
        msgObj.contentHeightsOriginalMessage[index.row] = newHeight
   .....
    }
    self.tableView.beginUpdates()
    self.tableView.endUpdates()
}
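One small mitigation for the jerking (a sketch, not a full fix): suppress the implicit animation around the height update and restore the content offset, so the reload does not visibly shift the scroll position:

let offset = tableView.contentOffset
UIView.performWithoutAnimation {
    tableView.beginUpdates()
    tableView.endUpdates()
    tableView.contentOffset = offset
}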

Another issue is that if the webview contains heavy images, the whole webview flickers as it reloads every time the cell is dequeued, causing a bad experience.

Here's an example of the webview reloading images.

The image issue resolves itself after some time, once the webview content is cached, but it is nevertheless a bad experience.

Is there a way to solve these issues?



#StackBounty: #ios #swift #uitextfield Cursor goes to end after setting text to UITextField inside shouldChangeCharactersIn

Bounty: 50

I have a text field with a phone number in it. I mask the phone number while the user is typing, so in the shouldChangeCharactersIn function I:

  • Get the user input (string)
  • Add that input to the text already written in the UITextField
  • Mask the text and set it on the UITextField
  • Return false

My question is that after I set the text of the UITextField (deleting or adding a character), the cursor moves to the end, but I want it to stay in the same position. (By the same position, I mean the same as when I don't implement the shouldChangeCharactersIn function.) How can I do that? Thank you.

func textField(_ textField: UITextField, shouldChangeCharactersIn range: NSRange, replacementString string: String) -> Bool {

    guard CharacterSet(charactersIn: "0123456789").isSuperset(of: CharacterSet(charactersIn: string)) else {
        return false
    }
    if let text = textField.text {
        let newLength = text.count + string.count - range.length
        let newText = text + string

        let textFieldText: NSString = (textField.text ?? "") as NSString
        let txtAfterUpdate = textFieldText.replacingCharacters(in: range, with: string)
        if newLength <= 15 {
            //textField.text = txtAfterUpdate
            textField.text = txtAfterUpdate.digits.applyPatternOnNumbers()
            return false
        }
        return newLength <= 15
    }
    return true
}

Mask Function:

func applyPatternOnNumbers(pattern: String = "(###) ### ## ##", replacementCharacter: Character = "#") -> String {
    var pureNumber = self.replacingOccurrences(of: "[^0-9]", with: "", options: .regularExpression)
    for index in 0 ..< pattern.count {
        guard index < pureNumber.count else { return pureNumber }
        let stringIndex = String.Index(encodedOffset: index)
        let patternCharacter = pattern[stringIndex]
        guard patternCharacter != replacementCharacter else { continue }
        pureNumber.insert(patternCharacter, at: stringIndex)
    }
    return pureNumber
}
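For reference, UITextField conforms to UITextInput, which exposes the cursor as selectedTextRange. One approach (a sketch to drop inside the shouldChangeCharactersIn code above; the offset may need adjusting when the mask inserts extra formatting characters) is to capture the cursor offset before reassigning the text and restore it afterwards:

if let selectedRange = textField.selectedTextRange {
    // Cursor position before the edit, advanced by the inserted characters
    let cursorOffset = textField.offset(from: textField.beginningOfDocument, to: selectedRange.start) + string.count
    textField.text = txtAfterUpdate.digits.applyPatternOnNumbers()
    if let newPosition = textField.position(from: textField.beginningOfDocument, offset: cursorOffset) {
        textField.selectedTextRange = textField.textRange(from: newPosition, to: newPosition)
    }
}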

What I want, shown as a GIF:

[GIF of the desired cursor behavior]



#StackBounty: #swift #ios #cocoa-touch Presenting and passing data to a modal view controller without using prepare(for:sender:) method

Bounty: 50

I am using a toolbar button to present a modal view controller (in which I let the user export data as a PDF file). The main section of my app is a UITableViewController subclass embedded in a UINavigationController.

Here is a schematic of my layout.

Schematic of my storyboard layout

The modal itself is embedded in a UINavigationController as I need it to have a bottom toolbar. It also has a transparent background and is presented using .overCurrentContext, so the main screen of the user’s data blurs underneath.

I found that to get it to float over everything else (including the navigation bar, etc.), I had to present it from the UINavigationController (otherwise the main navigation bar and toolbar appeared on top of it).
The problem with this is that the UITableViewController method prepare(for:sender:) is not called.

I call the segue to the modal view controller like this (from the UITableViewController subclass):

// User taps EXPORT button
@objc func exportButtonTapped(_ sender: UIBarButtonItem) {
    self.navigationController?.performSegue(withIdentifier: "showExport", sender: nil)
}

In order to transfer the array of user data to the modal view controller, I call the following code in the modal view controller:

override func viewDidLoad() {
    super.viewDidLoad()
    // Get data from array in main table view controller
    let masterNav = navigationController?.presentingViewController as! UINavigationController
    let myTableVC = masterNav.topViewController as! MyTableViewController
    self.userData = myTableVC.userData // This is of type: [MyObject]
} 

The data is then rendered to a PDF (using HTML templating) in the modal view controller’s viewWillAppear() method. This works as expected.

However, I have some concerns about doing it this way:

  1. Is it guaranteed that viewDidLoad() will finish before viewWillAppear() is called? Will even a larger data set be available for rendering as a PDF in viewWillAppear()?
  2. Is it acceptable to present modally from the UINavigationController?
  3. Should I be subclassing the main UINavigationController and using its prepare(for:sender:) method (if this is even an option)?
  4. In the performSegue(withIdentifier:sender:) method, does the sender argument make any difference?
  5. Is it preferable to use present() rather than a segue?

I would of course be grateful for any other advice or refinements to the code. It seems to work as expected; I just want to make sure I won’t run into problems in the future. Thank you.
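On question 5, a minimal sketch of the present() alternative with direct property injection ("ExportNav" and ExportViewController are assumed names for the modal's navigation controller and its root view controller):

@objc func exportButtonTapped(_ sender: UIBarButtonItem) {
    let storyboard = UIStoryboard(name: "Main", bundle: nil)
    guard let exportNav = storyboard.instantiateViewController(withIdentifier: "ExportNav") as? UINavigationController,
          let exportVC = exportNav.topViewController as? ExportViewController else { return }
    exportVC.userData = self.userData                 // inject the data instead of reaching back
    exportNav.modalPresentationStyle = .overCurrentContext
    navigationController?.present(exportNav, animated: true)
}

This sidesteps both the segue and the presentingViewController cast: viewDidLoad in the modal then simply reads the injected property.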

