#StackBounty: #ios #tensorflow How do I obtain the layer names for use in the iOS sample app? (Tensorflow)

Bounty: 50

I’m very new to TensorFlow, and I’m trying to train something using the Inception v3 network for use in an iPhone app. I managed to export my graph as a protocol buffer (.pb) file, manually remove the dropout nodes (correctly, I hope), and place that .pb file in my iOS project, but now I am receiving the following error:

Running model failed:Not found: FeedInputs: unable to find feed output input

which seems to indicate that my input_layer_name and output_layer_name variables in the iOS app are misconfigured.

I see in various places that they should be Mul and softmax respectively for Inception v3, but these values don’t work for me.

My question is: what is a layer (in this context), and how do I find out what mine are?

This is the exact definition of the model that I trained, but I don’t see “Mul” or “softmax” present.

This is what I’ve been able to learn about layers, but it seems to be a different concept, since “Mul” isn’t present in that list.

I’m worried that this might be a duplicate of this question, but “layers” aren’t explained there (are they tensors?), and graph.get_operations() seems to be deprecated, or maybe I’m using it wrong.
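For what it’s worth, the “layer names” the iOS sample expects appear to be graph node (operation) names, and one way to enumerate them is to parse the exported GraphDef directly. This is a rough sketch in Python: `my_graph.pb` is a placeholder for the exported file, and `tf.compat.v1.GraphDef` is plain `tf.GraphDef` on TF 1.x.

```python
import tensorflow as tf

def list_node_names(pb_path):
    """Parse a frozen GraphDef (.pb) file and return (name, op_type) pairs
    for every node, so candidate feed/fetch names can be spotted by eye."""
    graph_def = tf.compat.v1.GraphDef()  # plain tf.GraphDef in TF 1.x
    with open(pb_path, 'rb') as f:
        graph_def.ParseFromString(f.read())
    return [(node.name, node.op) for node in graph_def.node]
```

Printing the result of, say, `list_node_names('my_graph.pb')` (again, a placeholder path) should reveal whether a node actually named Mul or softmax exists in the trained graph, or what the real input and output node names are.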


Get this bounty!!!

#StackBounty: #ios #xcode #compiler-errors #build-error Xcode 8.3.1 missing default C++/Objective-C compiler. Cannot build projects

Bounty: 50

After upgrading to Xcode 8.3.1 I am getting an error:

Unsupported compiler 'com.apple.compilers.llvm.clang.1_0' selected for architecture 'x86_64'

screenshot of the error

Suggestions to set the compiler as the default do not help, since Xcode does not see a default compiler:

screenshot that there is no default compiler

Is there any solution?

Upd:

Actually, the compiler is present on the system:

terminal screenshot

Upd 2:

And one more screenshot with errors explained

errors screenshot


Get this bounty!!!

#StackBounty: #ios #iphone #watchkit #apple-watch Approaching Size Limit – The size of watch application

Bounty: 50

Our app has reached approximately 49 MB and we are not halfway done, so it will definitely exceed the 50 MB limit. I have a few questions:

1) Are On-Demand Resources possible in watchOS?

2) My resources (images, custom fonts) are duplicated: one copy in the watch app and one in the watch extension. How do I solve that?

3) The Swift core and other Swift frameworks consume about 28 MB of space. Is there any way to avoid that? (P.S. we have bitcode disabled.)


Get this bounty!!!

#StackBounty: #ios #provisioning-profile What is the Team Provisioning I have from scratch?

Bounty: 50

I understand provisioning profiles something like this: a provisioning profile contains signing-identity information and is used to sign application code. I can develop an application using simulators without any provisioning profile. To run and test an application on real devices, I need at least a development provisioning profile. These profiles are generated in the Apple Developer Portal and then downloaded for local use with Xcode. Currently, Xcode itself is responsible for generating and downloading profiles, and it does so automatically. There are also distribution provisioning profiles: Ad Hoc and App Store.

But I’m confused by the so-called “Team Provisioning Profile” used to sign the application in Xcode by default. It appears automatically and allows me to install the application on real devices, so I understand it to be an automatically generated development provisioning profile. But I can’t see any corresponding iOS development provisioning profile in the Apple Developer console.

My question is: what is the Team Provisioning Profile then? If it’s a kind of development provisioning profile, why can’t I see it in the Apple Developer console?


Get this bounty!!!

#StackBounty: #ios #swift Responder chain – different behavior with touchesMoved vs. touchesBegan?

Bounty: 50

Let’s say you create two view controllers, A and B. A has a segue to B (which specific segue I don’t think matters here, as the outcome seems to be the same; I’ll use push as the example). A has the following implementation:

class A: UIViewController {

  override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
      print("A received a begin touch")
  }
  override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
      print("A received a move touch")
  }
}

In B, you have:

class B: UIViewController {

  override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
      print("B received a begin touch")
  }
}

This blocks any touches from reaching view controller A. Even when there is a move touch, A does not receive it.

However, if instead the code for B was:

class B: UIViewController {

  override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
      print("B received a move touch")
  }
}

Then A’s “begin touch” prints, and both A’s and B’s “move touch” print. To summarize: when implementing just touchesBegan in class B, the only thing printed to the console is “B received a begin touch”. When implementing just touchesMoved in class B, the console prints “A received a begin touch” followed by an alternating pattern of both A and B receiving move touches.

So what’s the reason for this difference? Why does overriding touchesBegan in B stop the touchesMoved method from firing in A, while overriding touchesMoved in B does not? I saw in the docs that you must override all touch methods if you don’t call super, but I still don’t get why that would be necessary here.


Get this bounty!!!

#StackBounty: #android #ios #bonjour #multipeer-connectivity #dns-sd Android & iOS dns-sd: how to do a cross platform service disco…

Bounty: 50

On Android, I create a service broadcast with a simple attached map as info, like this:

HashMap<String, String> record = new HashMap<>();
record.put("info", "my android info");
WifiP2pDnsSdServiceInfo serviceInfo = WifiP2pDnsSdServiceInfo
        .newInstance("_myservice", "_tcp", record);
WifiP2pManager manager = (WifiP2pManager) getSystemService(Context.WIFI_P2P_SERVICE);
Channel channel = manager.initialize(this, this.getMainLooper(), null);
manager.addLocalService(channel, serviceInfo, null);

and to receive another device’s broadcast:

manager.setDnsSdResponseListeners(channel, this, this);

@Override
public void onDnsSdTxtRecordAvailable(String fullDomainName, Map<String, String> txtRecordMap, WifiP2pDevice srcDevice)

On iOS, it’s the same service name and info map:

MCPeerID* _myPeerId=[[MCPeerID alloc] initWithDisplayName:@"my ios peer"];
NSDictionary* _serviceInfo=@{@"info": @"my ios info"};

MCNearbyServiceAdvertiser* _serviceAdvertiser = [[MCNearbyServiceAdvertiser alloc] 
         initWithPeer:_myPeerId discoveryInfo:_serviceInfo serviceType:@"myservice"];
_serviceAdvertiser.delegate=advertiseDelegate;

MCNearbyServiceBrowser* _serviceBrowser=[[MCNearbyServiceBrowser alloc] 
      initWithPeer:_myPeerId serviceType:@"myservice"];
_serviceBrowser.delegate=browserDelegate;

[_serviceAdvertiser startAdvertisingPeer];
[_serviceBrowser startBrowsingForPeers];

and the delegates

-(void)browser:(MCNearbyServiceBrowser *)browser lostPeer:(MCPeerID *)peerID
-(void)browser:(MCNearbyServiceBrowser *)browser foundPeer:(MCPeerID *)peerID withDiscoveryInfo:(NSDictionary<NSString *,NSString *> *)info

An Android device can only see other Android devices, and likewise iOS devices can only see each other; the two families can’t discover one another.
I’d like each device to be visible to the others regardless of operating system.


Get this bounty!!!

#StackBounty: #ios #swift #avaudiosession Adding AVCaptureDeviceInput to CaptureSession resets/refocuses video

Bounty: 100

I have a video recording app, and I want it to work without stopping/pausing background music (when the user listens to Apple Music, for instance). I can do this nicely by setting the ‘mixWithOthers’ category option on the AVAudioSession singleton.

After setting the category, I also need to add an AVCaptureDeviceInput to the AVCaptureSession (so the audio gets recorded). This causes a glitch/hiccup in the background audio, and the video also resets/refocuses.

I have investigated, and it seems the background audio glitch can’t be avoided, but the video should not reset itself when an input is added. The result of the video resetting is that the first frame of the recorded video is either dark/black, or it starts out of focus and then focuses.

I also checked the Snapchat iOS app: it has the same audio glitch when recording starts, but its video records smoothly. What am I doing wrong?

My code:

    //Setting audio session to mixWithOthers upon startup
    let session = AVAudioSession.sharedInstance()

    do {
        try session.setCategory(AVAudioSessionCategoryPlayAndRecord,
                                with: [.mixWithOthers])
        if session.mode != AVAudioSessionModeVideoRecording {
            try session.setMode(AVAudioSessionModeVideoRecording)
        }
    } catch let error {
        print("avsession category error: \(error)")
    }

And then:

    //Just before recording starts will add audio input
    let audioDevice = AVCaptureDevice.defaultDevice(withMediaType: AVMediaTypeAudio)
    do
    {
        let deviceInput = try AVCaptureDeviceInput(device: audioDevice) as AVCaptureDeviceInput
        if captureSession.canAddInput(deviceInput) {
            captureSession.addInput(deviceInput)
        }
        else {
            print("Could not add audio device input to the session")
        }


    }
    catch let error as NSError {
        print(error.localizedDescription)
        return
    }

Would it be possible to do this without any glitches at all?
If not, how could I make it at least behave like Snapchat (no video reset upon the addInput call)?


Get this bounty!!!

#StackBounty: #ios #objective-c #ipad #nsurlsessiondownloadtask NSURLSession background download not working

Bounty: 50

I am trying to download a number of files using an NSURLSession background session with NSURLSessionDownloadTask. Everything works like a charm when the app is running in debug mode (with the device connected to Xcode), but it doesn’t work after unplugging the device (an iPad) from Xcode.

I am using Xcode 7.3.1 with iOS 9.3.5. I have already spent weeks tracing this strange behavior but haven’t had any breakthroughs; maybe I’m missing something needed to implement background downloads.
I recently upgraded Xcode to 8.1.2 and iOS to 10.2.1, assuming the upgrade might solve the issue, but it did not.


Get this bounty!!!

#StackBounty: #ios #objective-c #uiscrollview #uitabbarcontroller Automatically adjust content insets for views with a custom tab bar

Bounty: 50

I created a custom tab bar controller. It works very similar to UITabBarController but with a more advanced layout for the UITabBar.

How do I adjust the bottom content insets for views that appear in my custom tab bar controller? For instance, if I show a UITableView in my custom tab bar controller, I can manually adjust the content insets like this:

self.tableView.contentInset = UIEdgeInsetsMake(0, 0, 49, 0);
self.tableView.scrollIndicatorInsets = UIEdgeInsetsMake(0, 0, 49, 0);

However, the problem gets more complicated if I push another view, such as a collection view, on top of this table view. Is there a way to get these views to automatically adjust their content insets the way they do in the default UITabBarController implementation?


Get this bounty!!!

#StackBounty: #ios #avplayer #background-process #avplayeritem AVPlayer Not Loading Media In Background

Bounty: 50

When running in the background, my implementation of AVPlayer is unable to play audio (e.g. a podcast) that is downloaded, but it is able to play songs that are stored locally. Background playback only fails when the phone is disconnected from my computer. If my phone is connected directly to my computer/debugger, any media, local or downloaded, plays without a problem. In the foreground, there is also no problem playing either media type.

Here is my implementation:

AVPlayer                    *moviePlayer;               
AVPlayerItem                *playerItem;
NSURL *address = /* the url of either a local song or remote media */

if (moviePlayer != nil) {
    NSLog(@"removing rate, playbackBufferEmpty, playbackLikelyToKeepUp observers before creating new player");
    [moviePlayer removeObserver:self forKeyPath:@"rate"];
    [playerItem removeObserver:self forKeyPath:@"playbackBufferEmpty"];
    [playerItem removeObserver:self forKeyPath:@"playbackLikelyToKeepUp"];
    [playerItem removeObserver:self forKeyPath:@"status"];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemDidPlayToEndTimeNotification object:playerItem];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemFailedToPlayToEndTimeNotification object:playerItem];
    [[NSNotificationCenter defaultCenter] removeObserver:self name:AVPlayerItemPlaybackStalledNotification object:playerItem];
    [self setMoviePlayer:nil];  // release and nilify
}

// The following block of code was an experiment to see if starting a background task would resolve the problem. As implemented, this did not help.
if([[UIApplication sharedApplication] applicationState] != UIApplicationStateActive)
{
    NSLog(@"Experiment. Starting background task to keep iOS awake");
    task = [[UIApplication sharedApplication] beginBackgroundTaskWithExpirationHandler:^(void) {
    }];
}

 playerItem = [[AVPlayerItem alloc]initWithURL:address];
 moviePlayer = [[AVPlayer alloc]initWithPlayerItem:playerItem];

 // Add a notification to make sure that the player starts playing. This is handled by observeValueForKeyPath
 [moviePlayer addObserver:self
               forKeyPath:@"rate"
                  options:NSKeyValueObservingOptionNew
                  context:nil];

[playerItem addObserver:self forKeyPath:@"playbackBufferEmpty" options:NSKeyValueObservingOptionNew context:nil];
[playerItem addObserver:self forKeyPath:@"playbackLikelyToKeepUp" options:NSKeyValueObservingOptionNew context:nil];
[playerItem addObserver:self forKeyPath:@"status" options:NSKeyValueObservingOptionNew context:nil];

 // The following 2 notifications handle the end of play
 [[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(moviePlayBackDidFinish:)
                                              name:AVPlayerItemDidPlayToEndTimeNotification
                                            object:playerItem];

 [[NSNotificationCenter defaultCenter] addObserver:self
                                          selector:@selector(moviePlayBackDidFinish:)
                                              name:AVPlayerItemFailedToPlayToEndTimeNotification
                                            object:playerItem];

[[NSNotificationCenter defaultCenter] addObserver:self
                                         selector:@selector(moviePlayBackStalled:)
                                             name:AVPlayerItemPlaybackStalledNotification
                                           object:playerItem];

 // Indicate the action the player should take when it finishes playing.
 moviePlayer.actionAtItemEnd = AVPlayerActionAtItemEndPause;

 moviePlayer.automaticallyWaitsToMinimizeStalling = NO;

UPDATE: The above implementation also shows an experimental attempt to start a background task, in hopes of enabling the AVPlayer to play a podcast in the background. This did not help either, but I include it for reference. Not shown: I also end the background task after the AVPlayerItem’s playbackLikelyToKeepUp status changes to TRUE.

Then I have the following code to handle the keyPath notifications:

-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context {
    if ([keyPath isEqualToString:@"rate"]) {
        NSString *debugString = [NSString stringWithFormat: @"In observeValueForKeyPath: rate"];
        DLog(@"%@", debugString);
        debugString = [self appendAvPlayerStatus: debugString];
        [[FileHandler sharedInstance] logDebugString:debugString];
    }
    else if (object == playerItem && [keyPath isEqualToString:@"playbackBufferEmpty"]) {

        NSString *debugString = [NSString stringWithFormat: @"In observeValueForKeyPath: playbackBufferEmpty"];
        DLog(@"%@", debugString);
        debugString = [self appendAvPlayerStatus: debugString];
        [[FileHandler sharedInstance] logDebugString:debugString];
    }
    else if (object == playerItem && [keyPath isEqualToString:@"playbackLikelyToKeepUp"]) {

        NSString *debugString = [NSString stringWithFormat: @"In observeValueForKeyPath: playbackLikelyToKeepUp"];
        DLog(@"%@", debugString);
        debugString = [self appendAvPlayerStatus: debugString];
        [[FileHandler sharedInstance] logDebugString:debugString];
    }

    else if (object == playerItem && [keyPath isEqualToString:@"status"]) {

        NSString *debugString = [NSString stringWithFormat: @"In observeValueForKeyPath: status"];
        DLog(@"%@", debugString);
        debugString = [self appendAvPlayerStatus: debugString];
        [[FileHandler sharedInstance] logDebugString:debugString];
    }
}

And I have the following to handle notificationCenter notifications:

- (void) moviePlayBackDidFinish:(NSNotification*)notification {

    NSLog(@"moviePlaybackDidFinish. Time to stopMoviePlayerWithMusicPlayIndication");
    [self stopMoviePlayer: YES];  // stop the movie player 
}

- (void) moviePlayBackStalled:(NSNotification*)notification {

    NSString *debugString = [NSString stringWithFormat: @"In moviePlayBackStalled. Restarting player"];
    DLog(@"%@", debugString);
    debugString = [self appendAvPlayerStatus: debugString];
    [[FileHandler sharedInstance] logDebugString:debugString];
    [moviePlayer play];
}

By implementing a logging tool to trace execution when disconnected from the computer, here is what I am observing:

When running in the background disconnected from the computer, the playerItem is loaded with the URL address and the AVPlayer is initialized with the playerItem. This causes the KVO notification for rate to be delivered, but that is the last notification received. Audio does not play; the player just hangs. The output from my log, showing various moviePlayer and playerItem flags, is as follows:

ready to play movie/podcast/song dedication/Song from url: http://feedproxy.google.com/~r/1019TheMix-EricAndKathy/~5/ZZnF09tuxr0/20170309-1_1718764.mp3
In observeValueForKeyPath: rate - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusUnknown, playbackToKeepUp: 0, playbackBufferEmpty: 1, playbackBufferFull: 0, rate: 1.0

However, when running in the background while directly connected to the computer, or when running in the foreground, you can see from the log output below that after the URL address is loaded and the AVPlayer is initialized with the playerItem, a series of notifications are delivered for the rate, playbackBufferEmpty, playbackLikelyToKeepUp, and status key paths. Audio then starts playing. The output showing various moviePlayer and playerItem flags is as follows:

ready to play movie/podcast/song dedication/Song from url: http://feedproxy.google.com/~r/1019TheMix-EricAndKathy/~5/d3w52TBzd88/20170306-1-16_1717980.mp3
In observeValueForKeyPath: rate - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusUnknown, playbackToKeepUp: 0, playbackBufferEmpty: 1, playbackBufferFull: 0, rate: 1.0
In observeValueForKeyPath: playbackBufferEmpty - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusUnknown, playbackToKeepUp: 0, playbackBufferEmpty: 0, playbackBufferFull: 0, rate: 1.0
In observeValueForKeyPath: playbackLikelyToKeepUp - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusUnknown, playbackToKeepUp: 0, playbackBufferEmpty: 0, playbackBufferFull: 0, rate: 1.0
In observeValueForKeyPath: status - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusReadyToPlay, playbackToKeepUp: 0, playbackBufferEmpty: 0, playbackBufferFull: 0, rate: 1.0
In observeValueForKeyPath: playbackLikelyToKeepUp - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusReadyToPlay, playbackToKeepUp: 1, playbackBufferEmpty: 0, playbackBufferFull: 0, rate: 1.0
In observeValueForKeyPath: rate - timeCntrlStatus: AVPlayerTimeControlStatusPlaying, itemStatus: AVPlayerItemStatusReadyToPlay, playbackToKeepUp: 1, playbackBufferEmpty: 0, playbackBufferFull: 0, rate: 1.0

So in summary: when running in the foreground, or in the background while directly connected to the computer/debugger, the AVPlayer successfully fills the playback buffer and plays the audio. But when running in the background and NOT connected to the computer/debugger, the player does not appear to load the media and just hangs.

In all of the cases, the AVPlayerItemPlaybackStalledNotification is never received.

Sorry for the long explanation. Can anyone see what might be causing the player to hang in the background when not connected to the computer?


Get this bounty!!!