#StackBounty: #ios #swift #vpn #networkextension #l2tp Implementing VPN with L2TP protocol in iOS app

Bounty: 100

In iOS Settings there are options to create a VPN configuration using IPSec, IKEv2, or L2TP. Using Apple's NetworkExtension framework, there is an option to create a VPN using the IPSec and IKEv2 protocols only. They do work, but the problem is that I need to create the connection via L2TP, since that is the only protocol supported by the company's firewall.

There is a question, iOS app with custom VPN connect, from 2014, and it is answered with:

If you want to connect programmatically in iOS 8 you can use only the IPSec or IKEv2 protocols. The L2TP and PPTP protocols are private to Apple. It is not possible to use the L2TP and PPTP APIs in your applications; only Apple is currently using these APIs.

Is there any way to create L2TP VPN connection from iOS application (Swift)?
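As far as public API goes, the answer is still no for L2TP; the closest supported route is configuring one of the exposed protocols through NEVPNManager. The sketch below, with placeholder server address and credentials, shows the IKEv2 path (there is no public NEVPNProtocolL2TP class to substitute in):

```swift
import NetworkExtension

// Sketch: NetworkExtension exposes only IPSec and IKEv2 to third-party apps,
// so this configures IKEv2. Server address and username are placeholders.
func connectVPN() {
    let manager = NEVPNManager.shared()
    manager.loadFromPreferences { error in
        guard error == nil else { return }

        let proto = NEVPNProtocolIKEv2()
        proto.serverAddress = "vpn.example.com"   // placeholder
        proto.username = "user"                   // placeholder
        proto.useExtendedAuthentication = true

        manager.protocolConfiguration = proto
        manager.isEnabled = true
        manager.saveToPreferences { error in
            guard error == nil else { return }
            try? manager.connection.startVPNTunnel()
        }
    }
}
```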


Get this bounty!!!

#StackBounty: #ios #swift #xcode #ios-charts Returning to xcode/swift/ios project – suddenly get "no such module"

Bounty: 100

I think Xcode is having a laugh at me.

I open my old project, which uses the Charts library:
https://github.com/danielgindi/Charts/

and get the error

No such module Charts

where I use

import Charts

I then tried to remove Charts from my project, downloaded the Charts project again, and dragged the Charts.xcodeproj file into my project – no change.

The way my files are physically organized is:

 xcode-ios-projects
 - charts-ios
 - yourappplaform
 -- appcodegeneric (my source code and different .plist files for different apps)
 -- different-apps-asset-folder-1
 -- different-apps-asset-folder-2
 -- different-apps-asset-folder-x

Inside xcode IDE structure looks like this

appgeneric
- Charts.xcodeproj
- appcodegeneric
-- different-apps-asset-folder-1
-- different-apps-asset-folder-2
-- different-apps-asset-folder-x

I have updated Xcode, Charts, etc. to the newest versions. I have not yet gone through the Swift 4 conversion process.

It has been a long time since I touched my iOS/Swift project, so maybe I am missing something obvious – but this problem seems a bit odd to me.

UPDATE
After using “Clean” I now get the following error when I “Build”:

Module compiled with Swift 3.0.2 cannot be imported in Swift 3.3:
/Users/myname/Library/Developer/Xcode/kukuhkhkuhkuh/Build/Products/Debug-iphonesimulator/Charts.framework/Modules/Charts.swiftmodule/x86_64.swiftmodule

However – after cleaning that folder manually in Finder… I now get the same error as in the beginning:

no such module ‘Charts’


Get this bounty!!!

#StackBounty: #ios #swift #camera #core-graphics #avcapturemoviefileoutput How to compose AVCaptureVideoPreviewLayer with other views o…

Bounty: 500

I have managed to set up a basic AVCaptureSession which records a video and saves it on the device using AVCaptureFileOutputRecordingDelegate. I have been searching through the docs to understand how to add statistics overlays on top of the video being recorded.

i.e.

[screenshot: camera preview with multiple statistics overlays]

As you can see in the above image. I have multiple overlays on top of video preview layer. Now, when I save my video output I would like to compose those views onto the video as well.

What have I tried so far?

  • Honestly, I have just been jumping around the internet trying to find a reputable blog explaining how one would do this, but failed to find one.
  • I have read in a few places that one could render text-layer overlays, as described in the following post, by creating a CALayer and adding it as a sublayer.
  • But what if I want to render a MapView on top of the video being recorded? Also, I am not looking for screen capture. Some of the content on the screen will not be part of the final recording, so I want to be able to cherry-pick the views that will be composed.

What am I looking for?

  1. Direction.
  2. Not a straight-up solution.
  3. Documentation links and the class names I should read more about to build this.

Progress So Far:

I have managed to work out that I need to get hold of the CVImageBuffer from the CMSampleBuffer and draw text over it. It is still unclear to me whether it is possible to somehow overlay a MapView on the video being recorded.
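
One direction worth reading about (a sketch, not a live screen capture): composite the overlays in a post-export pass with AVMutableVideoComposition and AVVideoCompositionCoreAnimationTool. Any UIView, including a map view, can be snapshotted into a CALayer and handed to the tool; the file URL and overlay layer here are assumed inputs:

```swift
import AVFoundation
import UIKit

// Sketch: composite a CALayer (e.g. a snapshot of your stats/map views)
// over an already-recorded movie during export.
func export(videoURL: URL, overlay: CALayer, to outputURL: URL) {
    let asset = AVAsset(url: videoURL)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    let size = track.naturalSize
    let videoLayer = CALayer()
    let parentLayer = CALayer()
    parentLayer.frame = CGRect(origin: .zero, size: size)
    videoLayer.frame = parentLayer.frame
    overlay.frame = parentLayer.frame
    parentLayer.addSublayer(videoLayer)   // the video frames render here
    parentLayer.addSublayer(overlay)      // overlays are drawn on top

    let composition = AVMutableVideoComposition(propertiesOf: asset)
    composition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videoLayer, in: parentLayer)

    guard let session = AVAssetExportSession(
        asset: asset, presetName: AVAssetExportPresetHighestQuality) else { return }
    session.videoComposition = composition
    session.outputURL = outputURL
    session.outputFileType = .mov
    session.exportAsynchronously { /* inspect session.status / session.error */ }
}
```

The alternative, if the overlays must be burned in live rather than in post, is the AVCaptureVideoDataOutput route you mention: draw into each CVImageBuffer and feed the frames to an AVAssetWriter.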


Get this bounty!!!

#StackBounty: #ios #swift #uitableview #uiscrollview Can UITableView scroll with UICollectionView inside it?

Bounty: 50

I have the structures below…

[screenshot: table view containing two collection views]

I wrap two collection views into a table view.

One is in the table view header (Collection1), the other is in the table view's first row (Collection2).

All the functions work fine (both collection views).

Just…

When I scroll up in Collection2, Collection1 will not scroll up with it, because I'm only scrolling the collection views, not the table view.

They only scroll together when I scroll in Collection1.

Is it possible to make the header view scroll with the user, just like the App Store's carousel header?

Or have I gone down the wrong path and should approach this another way?
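
One common approach (a sketch with hypothetical property names, not the App Store's actual implementation) is to let the outer table view own all scrolling until its header is off screen, and only then enable scrolling on the inner collection view:

```swift
import UIKit

// Sketch: a coordinator acting as scroll delegate for both scroll views.
// `tableView` and `innerCollectionView` are assumed to be wired up elsewhere.
final class NestedScrollCoordinator: NSObject, UIScrollViewDelegate {
    weak var tableView: UITableView!
    weak var innerCollectionView: UICollectionView!

    func scrollViewDidScroll(_ scrollView: UIScrollView) {
        let headerHeight = tableView.tableHeaderView?.frame.height ?? 0
        if scrollView === tableView {
            // Hand scrolling to Collection2 only once the header is collapsed.
            innerCollectionView.isScrollEnabled =
                tableView.contentOffset.y >= headerHeight
        } else if scrollView === innerCollectionView,
                  !innerCollectionView.isScrollEnabled {
            // Pin the inner content until the hand-off happens.
            innerCollectionView.contentOffset = .zero
        }
    }
}
```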


Get this bounty!!!

#StackBounty: #swift #xcode #animation #uiviewanimation #framerjs Determine springWithDamping and initialSpringVelocity based off from …

Bounty: 100

My design team gives us animation parameters using friction and tension. For instance:

Has a spring effect (280 tension and 20.5 friction) over 0.3 seconds

Unfortunately, I've always guessed what these values convert to and eyeballed it; if it looks close, I send it over and they approve it. But constantly rebuilding the project with different values is time-consuming. There has to be an easier way.

I found Framer on GitHub, and it has led me to believe that the damping can be calculated like so:

let damping: CGFloat = 20.5 / (2 * (1 * 280).squareRoot()) // friction / (2 * sqrt(mass * tension))

However, I cannot figure out how to calculate the velocity from friction and tension. Is there an easier way to save this developer some valuable time?

Example of animation:

UIView.animate(withDuration: 0.3, delay: 0, usingSpringWithDamping: damping,
               initialSpringVelocity: ???, options: .curveEaseIn,
               animations: { /* Do stuff */ })
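
Treating the spring as a damped harmonic oscillator (mass 1, as in Framer's defaults), the two UIKit parameters can be sketched as below. Note that initialSpringVelocity is not derived from tension/friction at all: UIKit documents it as the view's starting velocity normalized by the total animation distance, so an animation starting from rest simply uses 0.

```swift
import CoreGraphics
import Foundation

// Sketch: convert tension/friction to UIKit's spring parameters.
// dampingRatio = friction / (2 * sqrt(mass * tension)); 1.0 means no bounce.
func springParameters(tension: CGFloat, friction: CGFloat, mass: CGFloat = 1,
                      startVelocity: CGFloat = 0, distance: CGFloat = 1)
    -> (dampingRatio: CGFloat, initialSpringVelocity: CGFloat) {
    let dampingRatio = friction / (2 * (mass * tension).squareRoot())
    // UIKit normalizes velocity: points/s divided by total distance in points.
    let initialSpringVelocity = distance == 0 ? 0 : startVelocity / distance
    return (dampingRatio, initialSpringVelocity)
}

// For the design team's values: 20.5 / (2 * sqrt(280)) ≈ 0.61
let params = springParameters(tension: 280, friction: 20.5)
```

With that helper, the designers can hand over tension/friction pairs directly and only the distance/start velocity of each animation needs to be decided per call site.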


Get this bounty!!!

#StackBounty: #ios #swift #avfoundation #avcomposition #avvideocomposition AVPlayer resizeAspect works only properly on iPhone X

Bounty: 50

resizeAspect as the video gravity only works properly for me when using an iPhone X.

For some reason, the black aspect bar only gets added to the top and not to the bottom. This is how it looks when I'm not using an iPhone X (the image is white):

[screenshot: video pushed toward the bottom on a non-iPhone-X device]

This is how it should look:

[screenshot: video correctly centered, with balanced bars, on iPhone X]

As you can see, on the iPhone X, everything looks clean and balanced as expected.

This is how I play the video:

    avPlayerLayer = AVPlayerLayer(player: avPlayer)
    avPlayerLayer.frame = PreviewLayer.bounds
    avPlayerLayer.videoGravity = .resizeAspect //Will automatically add black bars


    PreviewLayer.layer.insertSublayer(avPlayerLayer, at: 0)
    let playerItem = AVPlayerItem(url: video)
    avPlayer?.replaceCurrentItem(with: playerItem)
    avPlayer?.play()
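
One likely cause (an assumption, since the surrounding view controller code is not shown): if avPlayerLayer.frame is assigned in viewDidLoad, PreviewLayer.bounds may still be the storyboard's layout size (for example an iPhone X canvas), so on other devices the layer is mis-sized and the bars land on one side only. A sketch of the usual fix is to re-sync the frame after Auto Layout has run:

```swift
import UIKit

// Sketch: assumes avPlayerLayer and PreviewLayer are properties of this
// view controller, as in the snippet above.
override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    // Re-apply the frame once PreviewLayer has its final, device-specific bounds.
    avPlayerLayer?.frame = PreviewLayer.bounds
}
```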


Get this bounty!!!

#StackBounty: #ios #swift #xcode #cocos2d-iphone How to delete photos added in specific albums but not in others?

Bounty: 100

I wonder, in Swift and Xcode: how do I delete photos added in specific albums but not in others? For example, photos added to the Camera Roll but not to other albums? I know that, in principle, Apple makes the photo in other albums only a reference link to the same photo in the Camera Roll, so if you delete it in one place, you delete it everywhere. But would it be possible to work around this? For example, by copying it as a file copy to a new album with a different name?
Does anyone have any idea how to work around this?
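
The distinction in PhotoKit is between deleting an asset and removing a reference: PHAssetChangeRequest.deleteAssets deletes the photo everywhere, while PHAssetCollectionChangeRequest.removeAssets only removes it from one user-created album, leaving it in the Camera Roll and other albums. (Going the other way, deleting only from the Camera Roll, is not supported, since the Camera Roll is the backing store rather than a regular album.) A minimal sketch of the album-only removal:

```swift
import Photos

// Sketch: remove an asset's reference from one user album without
// deleting the underlying photo from the library.
func remove(_ asset: PHAsset, fromAlbum album: PHAssetCollection) {
    PHPhotoLibrary.shared().performChanges({
        let request = PHAssetCollectionChangeRequest(for: album)
        request?.removeAssets([asset] as NSArray)
    }) { success, error in
        if let error = error { print("Remove failed: \(error)") }
    }
}
```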

Thank you,


Get this bounty!!!

#StackBounty: #ios #swift #usernotifications #unusernotification User notification are restricted to a limit in iOS app how to send unl…

Bounty: 50

I am working on an alarm app project (click for the GitHub link). The code works fine, but the problem is that notifications are limited to approximately 64 at a time, so I am unable to keep sending notifications until the user responds to one. I have read that Apple restricts local notifications to only 64, but I have seen many apps on the App Store that send notifications continuously; here are a few links from the App Store.

https://itunes.apple.com/us/app/red-clock-free-edition-the-minimal-alarm-clock/id567374573?mt=8
https://itunes.apple.com/us/app/alarmy-alarm-clock/id1163786766?mt=8

Can anyone help me understand how these apps are able to send notifications continuously? I tried these apps, and they keep sending notifications until the user responds to the app (I checked for more than an hour). Below is the code for setting a single notification.

    let comingSunday = findNext("Sunday", afterDate: fireDate.adding(minutes: item))
    let triggerDate = Calendar.current.dateComponents([.year, .month, .weekday, .hour, .minute, .second], from: comingSunday!)
    let trigger = UNCalendarNotificationTrigger(dateMatching: triggerDate, repeats: alarm.repeatAlarm)

    let request = UNNotificationRequest(identifier: "\(alarm.uuid)0\(item)", content: notificationContent, trigger: trigger)
    UNUserNotificationCenter.current().add(request) { (error) in
        if let error = error {
            print("Unable to add notification request, \(error.localizedDescription)")
        }
    }
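
A common workaround (an assumption about how such apps behave, not a documented API) is to treat the 64-request cap as a sliding window: schedule only a short burst of near-term notifications and top the queue back up every time the app launches or the user interacts with a notification. A sketch, with the content and start date as assumed inputs:

```swift
import UserNotifications

// Sketch: schedule one notification per minute for the next hour,
// staying under the ~64 pending-request limit, and call this again
// from app launch / the notification center delegate to refill.
func refillAlarmBurst(content: UNNotificationContent, startingAt date: Date) {
    let center = UNUserNotificationCenter.current()
    center.removeAllPendingNotificationRequests()
    for minute in 0..<60 {
        let fire = date.addingTimeInterval(TimeInterval(minute * 60))
        let comps = Calendar.current.dateComponents(
            [.year, .month, .day, .hour, .minute, .second], from: fire)
        let trigger = UNCalendarNotificationTrigger(dateMatching: comps, repeats: false)
        let request = UNNotificationRequest(identifier: "alarm-burst-\(minute)",
                                            content: content, trigger: trigger)
        center.add(request)
    }
}
```

Pairing this with a critical-alert or repeating-sound strategy is a separate problem; this sketch only addresses keeping the pending queue full.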


Get this bounty!!!