Swift Playgrounds doesn't recognize structs in other folders
I'm using AVAudioEngine from AVFoundation together with SwiftUI in a Swift playground, with MVVM as the architecture. But if I separate my structs into different files, I get two errors: "Execution was interrupted, reason: signal SIGABRT." and "Cannot find file in scope". The only way everything works is to put them all in the same file.
Here is the playground's base code, where I call MusicCreatorView:
import SwiftUI
import PlaygroundSupport

public struct StartView: View {
    public var body: some View {
        ZStack {
            Rectangle()
                .fill(Color.white)
                .frame(width: 400, height: 400, alignment: .center)
            NavigationView {
                VStack {
                    NavigationLink("Start", destination: MusicCreatorView())
                }
            }
        }
    }
}

PlaygroundPage.current.setLiveView(StartView()) // error: Execution was interrupted, reason: signal SIGABRT.
AudioEngine can be found by StartView (if it is called from StartView, the SIGABRT error just moves from PlaygroundPage.current.setLiveView() to the call site), but it can't be found by MusicCreatorView.
import Foundation
import AVFoundation

public struct AudioEngine {
    public var engine = AVAudioEngine()
    public var player = AVAudioPlayerNode()
    public var audioBuffer: AVAudioPCMBuffer?
    public var audioFormat: AVAudioFormat?

    public var audioFile: AVAudioFile? {
        didSet {
            if let audioFile = audioFile {
                audioFormat = audioFile.fileFormat
            }
        }
    }

    public var audioFileURL: URL? {
        didSet {
            if let audioFileURL = audioFileURL {
                audioFile = try? AVAudioFile(forReading: audioFileURL)
            }
        }
    }

    public init() {
        setupAudio()
    }

    public mutating func setupAudio() {
        audioFileURL = Bundle.main.url(forResource: "EverybodysCirculation", withExtension: "mp3")
        guard let format = audioFormat else { return }
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: format)
        engine.prepare()
        do {
            try engine.start()
        } catch {
            print(error.localizedDescription)
        }
    }

    public func scheduleAudioFile() {
        guard let audioFile = audioFile else { return }
        player.scheduleFile(audioFile, at: nil, completionHandler: nil)
    }

    public func playSound() {
        player.isPlaying ? player.pause() : player.play()
    }
}
Trying to call AudioEngine from MusicCreatorView:
import Foundation
import SwiftUI
import AVFoundation

public struct MusicCreatorView: View {
    var audioEngine = AudioEngine() // Cannot find 'AudioEngine' in scope

    public init() {}

    public var body: some View {
        Text("Try to Create your own music")
        Button("play") {
            print("apertou") // "apertou" = "pressed" in Portuguese
            audioEngine.playSound() // audioEngine <<error type>>
        }
    }
}
Here is how my files are organized: https://i.stack.imgur.com/losWf.png
1 answer
-
answered 2021-04-08 03:32
matt
Multiple files in Sources cannot see each other. They are just independent libraries available to the actual playground. You should be developing this in a real iOS project, rather than a playground.
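A minimal sketch of the playground-side workaround the question already discovered (everything in one file): declaring the types together on the main playground page means they share one scope, so no cross-file visibility is needed. AudioEngine's body is elided below; it is the struct from the question, with only a stub playSound() shown so the sketch compiles.

import SwiftUI
import PlaygroundSupport

public struct AudioEngine {
    // ... properties and setup as in the question ...
    public func playSound() { print("play/pause") }
}

public struct MusicCreatorView: View {
    var audioEngine = AudioEngine()
    public var body: some View {
        Button("play") { audioEngine.playSound() }
    }
}

public struct StartView: View {
    public var body: some View {
        NavigationView {
            NavigationLink("Start", destination: MusicCreatorView())
        }
    }
}

PlaygroundPage.current.setLiveView(StartView())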
See also questions close to this topic
-
create a modal popup view on tap on callout of the annotation, MKMapView in Swift
I've coded the following screens in Swift, with a map, annotations on it, and a custom callout shown when tapping on an annotation, as shown below.
I also added the code to open a custom view controller when the button in the callout is tapped:
But the custom view overlaps the screen and blocks the user from interacting with the map, because the map view is behind it. The user has to swipe the view down to get back to the map.
How can I open the view as shown below (like the Airbnb map screen) and also let the user keep using the map while the view is still on the screen (bottom, top, etc.), as shown below?
Thank you
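No answer is shown for this one, but a common way to get an Airbnb-style panel is to embed the detail screen as a child view controller pinned to the bottom of the map screen, rather than presenting it modally; touches outside the panel then still reach the map. A minimal sketch with hypothetical names (showDetailPanel, and the detail controller is whatever the callout button opens):

import UIKit
import MapKit

class MapViewController: UIViewController {
    let mapView = MKMapView()

    // Adds the detail screen as a child over the bottom third of the map.
    // Because it is a child view (not a modal presentation), the rest of
    // the map keeps receiving touches.
    func showDetailPanel(_ detail: UIViewController) {
        addChild(detail)
        view.addSubview(detail.view)
        detail.view.translatesAutoresizingMaskIntoConstraints = false
        NSLayoutConstraint.activate([
            detail.view.leadingAnchor.constraint(equalTo: view.leadingAnchor),
            detail.view.trailingAnchor.constraint(equalTo: view.trailingAnchor),
            detail.view.bottomAnchor.constraint(equalTo: view.bottomAnchor),
            detail.view.heightAnchor.constraint(equalTo: view.heightAnchor, multiplier: 0.33)
        ])
        detail.didMove(toParent: self)
    }
}

On iOS 15 and later, UISheetPresentationController with a .medium detent offers similar non-blocking behavior out of the box.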
-
Memory Allocations Profiler and Steadily Increasing Persistent Memory - Signs of Trouble?
I have an app I am developing and the stakeholder using it said that the app becomes slow and unusable/unresponsive after consistent usage all day. Killing it and starting over causes it to run fine.
I don't seem to have this trouble on my device, but I started looking at the memory usage in both the simulator and on the phone in the debugger, and observed that my memory would steadily increase if I took the basic action of going from screen to screen. These are pretty involved screens, but if I just go forward to the 'add new item' screen, then back to the product listing screen, the memory jumps up 30MB. If I keep doing this same action, over and over, I can get it to 1.1GB of memory.
I then took it a step further, hooked up my phone, and ran profiler (specifically memory leaks). I found one leak involving my usage of ads, so I just commented out all the code for a test and while the leaks are gone, the memory continues to go up steadily.
I then ran the allocations tool, and after a few min of going back and forth in the same manner, here is the output:
As you can see, it's 1.53GB, and if I keep doing the same action I can get it to 2GB+. Oddly enough, my phone never seems to mind, and the screens are just slightly laggy at times, otherwise not too bad. Certainly usable.
Before I start ripping out the floorboards, I wanted to confirm this is a likely sign of a problem. Any suggestions on where I can start looking? If persistent memory is the issue, what would be some typical gotchas or pitfalls? What is "anonymous VM"?
Thank you so much if you're reading this far; I appreciate any guidance!
UPDATE/EDIT
After some guidance here, I noticed, oddly enough, that the "add product" page causes the memory to jump ~10MB each time I visit it. After commenting out code, I narrowed it down to this section (and even the line of code) causing the jump. Removing this code causes it to remain stable and not increase.
// Render collection views
func collectionView(_ collectionView: UICollectionView, cellForItemAt indexPath: IndexPath) -> UICollectionViewCell {
    let cell = collectionView.dequeueReusableCell(withReuseIdentifier: "cell", for: indexPath as IndexPath)
    let member: MemberDto = groupMembers[indexPath.item]
    let contactInitials = cell.viewWithTag(1) as! UILabel
    let contactAvatar = cell.viewWithTag(2) as! UIImageView
    contactAvatar.image = UIImage(named: "anonymous")
    contactInitials.text = member.displayName
    contactAvatar.layer.cornerRadius = contactAvatar.frame.size.width / 2
    contactAvatar.clipsToBounds = true
    contactAvatar.contentMode = UIViewContentMode.scaleAspectFill
    contactAvatar.layer.borderWidth = 5.0
    if (member.profileImage.trimmingCharacters(in: CharacterSet.whitespaces) != "") {
        UserService.getProfilePicture(userId: member.userId) { response in
            contactAvatar.image = response.value
        }
    }
So, the offending line of code is here:
contactAvatar.image = response.value
Adding it in, and going back and forth to this table view controller, causes the memory to go up and up, all the way to 2GB. Removing that one line of code (where I set the image) keeps it stable at ~40-70MB, or it goes up very, very slowly (dozens of repeats only got it to 80MB).
I realized I was not caching this image
I decided to try caching this with my framework, and that immediately resolved the issue. I suppose the line of code was pulling the image into memory or something like that? It doesn't seem like the networking call is the actual issue, since I left that in (and even went so far to make additional calls to my API) and that doesn't seem to do much by way of memory increase.
Just a few pieces of info:
- From the main screen, you tap on a + symbol in the navigation menu bar to come to this screen.
- I am using a regular segue on my storyboard, associated with the navigationbutton, to take the user here
- Placing deinit on this vc does not seem to ever hit, even with print/code in there and breakpoints
- Making API calls from within my uitableviewcontroller doesn't seem to cause the image to load UNLESS I combine that with SETTING the image. If I make a network call, but don't set the image, it doesn't increase.
What mistake did I make? I feel like caching the image is a band-aid. I recall reading that you're not supposed to load images over the network from within a UITableViewController, but what is the alternative: to pull all user images from the collection in advance and cache them before the table view loads?
Thanks!
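No accepted answer is shown here, but the pattern the asker is reaching for is an in-memory image cache keyed by user ID, so each avatar is downloaded and decoded once rather than on every cell dequeue. A minimal sketch using NSCache (AvatarCache is a hypothetical name; UserService.getProfilePicture is the question's own API):

import UIKit

// NSCache evicts automatically under memory pressure, which is the
// behavior wanted for decoded images.
final class AvatarCache {
    static let shared = AvatarCache()
    private let cache = NSCache<NSString, UIImage>()

    func image(for userId: String) -> UIImage? {
        cache.object(forKey: userId as NSString)
    }

    func store(_ image: UIImage, for userId: String) {
        cache.setObject(image, forKey: userId as NSString)
    }
}

In cellForItemAt, check AvatarCache.shared.image(for: member.userId) first and only hit the network on a miss, storing the result in the cache; the completion handler should also guard against setting an image on a cell that has since been reused for another member.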
-
Swift OS X disable/hide button in view controller if download is in progress from table view in separate classes
I have a table of products, and each has a download button. Each button is its own NSTableCellView in a separate class. I want to disable/hide a button in the original ProductViewController class if a download is in progress, but whenever I try to do that my app crashes, with very little error messaging as to why. Is there any way to accomplish my goal?
let viewCon = ProductViewController()

func urlSession(_ session: URLSession, downloadTask: URLSessionDownloadTask, didWriteData bytesWritten: Int64, totalBytesWritten: Int64, totalBytesExpectedToWrite: Int64) {
    let filesize: Int64 = Int64(passedLongBytes)!
    let percentage = CGFloat(totalBytesWritten) / CGFloat(filesize)
    DispatchQueue.main.async {
        self.shapeLayer.strokeEnd = percentage
        print("PERCENTAGE: \(Int(percentage * 100))%")
        if ((Int(percentage * 100) < 100)) {
            self.viewCon.backButtonOutlet.isHidden = true
        } else {
            self.viewCon.backButtonOutlet.isHidden = false
        }
    }
}
All I get is the error message "Thread 1: EXC_BAD_INSTRUCTION (code=EXC_I386_INVOP, subcode=0x0)".
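No answer is shown, but the crash is consistent with one detail in the snippet: let viewCon = ProductViewController() creates a brand-new controller whose storyboard outlets are never loaded, so backButtonOutlet is nil and unwrapping it traps with EXC_BAD_INSTRUCTION. A conventional fix is to report progress to the controller instance that is actually on screen, e.g. via a delegate. A sketch with hypothetical names (DownloadProgressDelegate, DownloadCellView):

import Cocoa

// The cell only reports progress; the real, on-screen ProductViewController
// decides what to do with its own button.
protocol DownloadProgressDelegate: AnyObject {
    func downloadProgressDidChange(_ fraction: CGFloat)
}

class DownloadCellView: NSTableCellView {
    weak var progressDelegate: DownloadProgressDelegate?

    func reportProgress(_ fraction: CGFloat) {
        DispatchQueue.main.async {
            self.progressDelegate?.downloadProgressDidChange(fraction)
        }
    }
}

class ProductViewController: NSViewController, DownloadProgressDelegate {
    @IBOutlet weak var backButtonOutlet: NSButton!

    // Assign cell.progressDelegate = self when creating each cell view.
    func downloadProgressDidChange(_ fraction: CGFloat) {
        backButtonOutlet.isHidden = fraction < 1.0
    }
}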
-
How to add a color beside text in a list in SwiftUI? (Data Flow)
I'm trying to get text and a color from the user and add them to the list in SwiftUI. I can already pass the text data, but unfortunately I can't do the same for the color, even though they should work the same way; below there is an image of the app. To make this work, we should provide a Binding for PreAddTextField. Thanks for your help.
Here is my code:
import SwiftUI

struct AddListView: View {
    @Binding var showAddListView: Bool
    @ObservedObject var appState: AppState
    @StateObject private var viewModel = AddListViewViewModel()

    var body: some View {
        ZStack {
            Title(addItem: {
                viewModel.textItemsToAdd.append(.init(text: "", color: .purple))
            })
            VStack {
                ScrollView {
                    ForEach(viewModel.textItemsToAdd, id: \.id) { item in // note this is id: \.id and not \.self
                        PreAddTextField(textInTextField: viewModel.bindingForId(id: item.id),
                                        colorPickerColor: <#Binding<Color>#>)
                    }
                }
            }
            .padding()
            .offset(y: 40)
            Buttons(showAddListView: $showAddListView, save: {
                viewModel.saveToAppState(appState: appState)
            })
        }
        .frame(width: 300, height: 200)
        .background(Color.white)
        .shadow(color: Color.black.opacity(0.3), radius: 10, x: 0, y: 10)
    }
}

struct SwiftUIView_Previews: PreviewProvider {
    static var previews: some View {
        AddListView(showAddListView: .constant(false), appState: AppState())
    }
}

struct PreAddTextField: View {
    @Binding var textInTextField: String
    @Binding var colorPickerColor: Color

    var body: some View {
        HStack {
            TextField("Enter text", text: $textInTextField)
            ColorPicker("", selection: $colorPickerColor)
        }
    }
}

struct Buttons: View {
    @Binding var showAddListView: Bool
    var save: () -> Void

    var body: some View {
        VStack {
            HStack(spacing: 100) {
                Button(action: { showAddListView = false }) {
                    Text("Cancel")
                }
                Button(action: {
                    showAddListView = false
                    // What should happen here to add Text to List???
                    save()
                }) {
                    Text("Add")
                }
            }
        }
        .offset(y: 70)
    }
}

struct Title: View {
    var addItem: () -> Void

    var body: some View {
        VStack {
            HStack {
                Text("Add Text to list")
                    .font(.title2)
                Spacer()
                Button(action: { addItem() }) {
                    Image(systemName: "plus")
                        .font(.title2)
                }
            }
            .padding()
            Spacer()
        }
    }
}
The data model:
import SwiftUI

struct Text1: Identifiable, Hashable {
    var id = UUID()
    var text: String
    var color: Color
}

class AppState: ObservableObject {
    @Published var textData: [Text1] = [.init(text: "Item 1", color: .purple),
                                        .init(text: "Item 2", color: .purple)]
}

class AddListViewViewModel: ObservableObject {
    @Published var textItemsToAdd: [Text1] = [.init(text: "", color: .purple)] // start with one empty item

    // save all of the new items -- don't save anything that is empty
    func saveToAppState(appState: AppState) {
        appState.textData.append(contentsOf: textItemsToAdd.filter { !$0.text.isEmpty })
    }

    // these Bindings get used for the TextFields -- they're attached to the item IDs
    func bindingForId(id: UUID) -> Binding<String> {
        .init { () -> String in
            self.textItemsToAdd.first(where: { $0.id == id })?.text ?? ""
        } set: { (newValue) in
            self.textItemsToAdd = self.textItemsToAdd.map {
                guard $0.id == id else { return $0 }
                return .init(id: id, text: newValue, color: .purple)
            }
        }
    }
}
And finally:
import SwiftUI

struct ListView: View {
    @StateObject var appState = AppState() // store the AppState here
    @State private var showAddListView = false

    var body: some View {
        NavigationView {
            VStack {
                ZStack {
                    List(appState.textData, id: \.self) { text in
                        HStack {
                            Image(systemName: "square")
                                .foregroundColor(text.color)
                            Text(text.text)
                        }
                    }
                    if showAddListView {
                        AddListView(showAddListView: $showAddListView, appState: appState)
                            .offset(y: -100)
                    }
                }
            }
            .navigationTitle("List")
            .navigationBarItems(trailing:
                Button(action: { showAddListView = true }) {
                    Image(systemName: "plus")
                        .font(.title2)
                }
            )
        }
    }
}

struct ContentView_Previews: PreviewProvider {
    static var previews: some View {
        ListView()
    }
}
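The missing piece is a Binding<Color> analogous to the question's own bindingForId(id:). A sketch of such a helper (bindingForColor is a hypothetical name), which would fill the <#Binding<Color>#> placeholder:

import Foundation
import SwiftUI

extension AddListViewViewModel {
    // Mirrors bindingForId(id:), but reads/writes the item's color.
    func bindingForColor(id: UUID) -> Binding<Color> {
        .init {
            self.textItemsToAdd.first(where: { $0.id == id })?.color ?? .purple
        } set: { newColor in
            self.textItemsToAdd = self.textItemsToAdd.map {
                guard $0.id == id else { return $0 }
                return .init(id: id, text: $0.text, color: newColor)
            }
        }
    }
}

Usage: PreAddTextField(textInTextField: viewModel.bindingForId(id: item.id), colorPickerColor: viewModel.bindingForColor(id: item.id)). Note that bindingForId's setter rebuilds items with color: .purple, which would discard a picked color whenever the text changes; preserving $0.color there is likely needed as well.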
-
How can I respond to files from a Share Extension in an app using a SwiftUI 2.0 lifecycle?
I'm building an app using the SwiftUI 2.0 lifecycle on iOS14.
I haven't used them before but I've added a Share Extension target to my project & have configured it to only open the specific types of files I want to be able to open in my app.
When opening the share sheet on my file type, my app shows as expected and my app is opened when tapping on it.
I'm now trying to work out how to handle those incoming files.
I've added an appDelegate adaptor to my App struct and know I can use:
func application(_ application: UIApplication, performActionFor shortcutItem: UIApplicationShortcutItem, completionHandler: @escaping (Bool) -> Void)
to handle Home Screen quick actions and:
func application(_ application: UIApplication, handlerFor intent: INIntent)
to handle Shortcut/Siri intents but I can't find an appropriate method for handling incoming files from the Share Extension.
I've also looked at the new onOpenURL and onContinueUserActivity instance methods, but no dice either. It seems as though in UIKit you have an extensionContext with inputItems in a view controller, but I can't find anything similar with SwiftUI. Can anyone point me in the right direction, please? Thank you!
Edit: It appears that I do get one URL from onOpenURL for the incoming files. Haven't worked out multiple files yet.
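For reference, a minimal sketch of the scene-lifecycle hook mentioned above, with a hypothetical ContentView; each URL the system hands the app arrives through onOpenURL, and whether multiple shared files produce multiple calls is an assumption worth verifying:

import SwiftUI

struct ContentView: View {
    let urls: [URL]
    var body: some View {
        List(urls, id: \.self) { url in
            Text(url.lastPathComponent)
        }
    }
}

@main
struct MyApp: App {
    @State private var importedURLs: [URL] = []

    var body: some Scene {
        WindowGroup {
            ContentView(urls: importedURLs)
                // Collect each URL the system delivers to the app.
                .onOpenURL { url in
                    importedURLs.append(url)
                }
        }
    }
}

-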
Generic parameter 'V' could not be inferred
I get the error: "Generic parameter 'V' could not be inferred" and "Value of type '(_) -> some View' has no subscripts" for the last line of the PickerView code below. Appreciate any assistance. Thanks.
struct HITScore: View {
    @State private var selectedCountDrop = 0
    var CountDrop = [
        "Platelet Count >50% fall and nadir >= 20K",
        "Platelet Count 30-50% fall OR nadir 10-19K",
        "Platelet Count <30% fall or nadir < 10K"]

    var body: some View {
        Form {
            Picker(selection: $selectedCountDrop, label: Text("Platelet count")) {
                ForEach(0..<CountDrop.count) { index in
                    Text(self.CountDrop[index]).tag[index]
                }
            }
        }
    }
}
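No answer is shown, but the compiler message points at a single character: tag is a method, so .tag[index] tries to subscript a function reference (hence "Value of type '(_) -> some View' has no subscripts"), and the failed call is what keeps the generic parameter from being inferred. The same view with parentheses instead of brackets:

import SwiftUI

struct HITScore: View {
    @State private var selectedCountDrop = 0
    var CountDrop = [
        "Platelet Count >50% fall and nadir >= 20K",
        "Platelet Count 30-50% fall OR nadir 10-19K",
        "Platelet Count <30% fall or nadir < 10K"]

    var body: some View {
        Form {
            Picker(selection: $selectedCountDrop, label: Text("Platelet count")) {
                ForEach(0..<CountDrop.count) { index in
                    Text(self.CountDrop[index]).tag(index) // .tag(index), not .tag[index]
                }
            }
        }
    }
}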
-
get list of cameras
I am wondering how to get a list of cameras suitable for showing in the app (for example, in an alert controller). For now I get the array of cameras with AVCaptureDeviceDiscoverySession by doing the following:

AVCaptureDeviceDiscoverySession *session = [AVCaptureDeviceDiscoverySession discoverySessionWithDeviceTypes:@[AVCaptureDeviceTypeBuiltInWideAngleCamera, AVCaptureDeviceTypeBuiltInTelephotoCamera]
                                                                                                  mediaType:AVMediaTypeVideo
                                                                                                   position:AVCaptureDevicePositionUnspecified];
NSArray *captureDevices = [session devices];
The problem is that NSArray contains excess info:
"<AVCaptureFigVideoDevice: 0x10ff0bb80 [Back Camera][com.apple.avfoundation.avcapturedevice.built-in_video:0]>", "<AVCaptureFigVideoDevice: 0x10ff107f0 [Front Camera][com.apple.avfoundation.avcapturedevice.built-in_video:1]>"
But all I need is "back camera" and "front camera". Thank you in advance!
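No answer is shown here, but the "excess info" is just each device's description; the display name is a separate property. A sketch reading localizedName (shown in Swift; in Objective-C the same property is device.localizedName):

import AVFoundation

// Collect display names like "Back Camera" / "Front Camera" for an alert.
let session = AVCaptureDevice.DiscoverySession(
    deviceTypes: [.builtInWideAngleCamera, .builtInTelephotoCamera],
    mediaType: .video,
    position: .unspecified)
let cameraNames = session.devices.map { $0.localizedName }
print(cameraNames) // e.g. ["Back Camera", "Front Camera"]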
-
Record camera output before Vision recognises event
My app recognises an event using Vision, working on CMSampleBuffers to do so. After the event, I am already successfully recording the video using AVWriter.
Now I want to record the full motion and thus record 1-2 seconds before the event occurred.
I tried pushing the CMSampleBuffer into a ring buffer, but that starves the camera of buffers.

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    // sends that to detectBall
}

/// Gets called by the camera every time there is a new buffer available
func detectBall(inBuffer buffer: CMSampleBuffer,
                ballDetectionRequest: VNCoreMLRequest,
                orientation: CGImagePropertyOrientation,
                frame: NormalizedPoint,
                updatingRingBuffer: PassthroughSubject<AppEnvironment.AVState.RingBufferItem, Never>) throws {
    // I tried to convert it into a CVPixelBuffer but its a shallow copy as well so it also starves the camera
    let imageBuffer: CVPixelBuffer = CMSampleBufferGetImageBuffer(buffer)!
    /// rotated 90 because of the cameras native landscape orientation
    let visionHandler = VNImageRequestHandler(ciImage: croppedImage, options: [:])
    try visionHandler.perform([ballDetectionRequest])
    if let results = ballDetectionRequest as? [VNClassificationObservation] {
        // Filter out classification results with low confidence
        let filteredResults = results.filter { $0.confidence > 0.9 }
        guard let topResult = results.first, topResult.confidence > 0.9 else { return }
        // print(" its a: \(topResult.identifier)")
        // print("copy buffer")
        updatingRingBuffer.send(AppEnvironment.AVState.RingBufferItem(
            /// HERE IS THE PROBLEM: AS SOON AS I SEND IT SOMEWHERE ELSE THE CAMERA IS STARVED
            buffer: imageBuffer,
            ball: topResult.identifier == "ball")
How can I continuously store these 1-2 seconds of video without writing them to disk, and then prepend them to the video file?
Thanks!
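No answer is shown, but a common way around the starvation is to deep-copy each frame's pixels into memory you own and keep only the last N copies; the capture pipeline's buffer is then released as soon as the callback returns. A sketch of such a copy, assuming the per-frame memcpy cost is acceptable (the ring-buffer bookkeeping itself is omitted):

import Foundation
import CoreVideo

// Deep-copies a CVPixelBuffer so the capture pipeline's buffer can be
// released immediately; keep the copies in a small fixed-size ring to
// retain the trailing 1-2 seconds of frames.
func deepCopy(_ source: CVPixelBuffer) -> CVPixelBuffer? {
    var copyOut: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault,
                        CVPixelBufferGetWidth(source),
                        CVPixelBufferGetHeight(source),
                        CVPixelBufferGetPixelFormatType(source),
                        nil,
                        &copyOut)
    guard let copy = copyOut else { return nil }
    CVPixelBufferLockBaseAddress(source, .readOnly)
    CVPixelBufferLockBaseAddress(copy, [])
    defer {
        CVPixelBufferUnlockBaseAddress(copy, [])
        CVPixelBufferUnlockBaseAddress(source, .readOnly)
    }
    if CVPixelBufferIsPlanar(source) {
        // Copy each plane row by row (source and copy strides may differ).
        for plane in 0..<CVPixelBufferGetPlaneCount(source) {
            guard let src = CVPixelBufferGetBaseAddressOfPlane(source, plane),
                  let dst = CVPixelBufferGetBaseAddressOfPlane(copy, plane) else { return nil }
            let srcStride = CVPixelBufferGetBytesPerRowOfPlane(source, plane)
            let dstStride = CVPixelBufferGetBytesPerRowOfPlane(copy, plane)
            for row in 0..<CVPixelBufferGetHeightOfPlane(source, plane) {
                memcpy(dst.advanced(by: row * dstStride),
                       src.advanced(by: row * srcStride),
                       min(srcStride, dstStride))
            }
        }
    } else {
        memcpy(CVPixelBufferGetBaseAddress(copy),
               CVPixelBufferGetBaseAddress(source),
               min(CVPixelBufferGetDataSize(source), CVPixelBufferGetDataSize(copy)))
    }
    return copy
}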
-
How to get AVFileType from AVAsset or from URL
I have been searching online and I have found some solutions but not quite the one I need.
I basically need to be able to get an instance of AVFileType (mp4, mov, ...) from either an AVAsset or a URL.
I have seen that AVURLAsset has a class variable returning an array of available ones but not for the asset itself.
Is that possible?
Thank you
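One approach, assuming the file extension is trustworthy: AVFileType raw values are UTIs, so the URL's extension can be mapped to a type identifier and wrapped. A sketch using the iOS 14/macOS 11 UniformTypeIdentifiers framework (avFileType is a hypothetical helper, and not every container maps cleanly, so treat the result as optional):

import AVFoundation
import UniformTypeIdentifiers

// Maps a file URL's extension to an AVFileType via its UTI,
// e.g. "mov" -> com.apple.quicktime-movie -> AVFileType.mov.
func avFileType(for url: URL) -> AVFileType? {
    guard let type = UTType(filenameExtension: url.pathExtension) else { return nil }
    return AVFileType(rawValue: type.identifier)
}

// Usage:
// avFileType(for: URL(fileURLWithPath: "/tmp/clip.mp4")) // .mp4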
-
AudioEngine to CMSampleBuffer
I am using AVAudioEngine to capture audio from the macOS microphone and some other sources in order to mix multiple audio inputs, since AVCaptureSession does not have a feature to mix multiple inputs. Therefore, I am trying to take the AVAudioPCMBuffer, which I can obtain by installing a tap on one of the audio engine's nodes, and convert it back to a CMSampleBuffer. This is what I have so far.
engine.mainMixerNode.installTap(onBus: 0, bufferSize: 4096, format: format) { (buffer, time) in
    var sbuf: CMSampleBuffer?
    var timing = CMSampleTimingInfo(duration: CMTimeMake(value: 4096, timescale: 44100),
                                    presentationTimeStamp: .zero,
                                    decodeTimeStamp: .invalid)
    let code = CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                                    dataBuffer: nil,
                                    dataReady: false,
                                    makeDataReadyCallback: nil,
                                    refcon: nil,
                                    formatDescription: buffer.format.formatDescription,
                                    sampleCount: CMItemCount(4096),
                                    sampleTimingEntryCount: 1,
                                    sampleTimingArray: &timing,
                                    sampleSizeEntryCount: 0,
                                    sampleSizeArray: nil,
                                    sampleBufferOut: &sbuf)
    print(code)
    let status = CMSampleBufferSetDataBufferFromAudioBufferList(sbuf!,
                                                                blockBufferAllocator: kCFAllocatorDefault,
                                                                blockBufferMemoryAllocator: kCFAllocatorDefault,
                                                                flags: 0,
                                                                bufferList: buffer.audioBufferList)
    print(status)
}
However, when I print(status) it prints -12731, and the data buffer is not initialized.
I am not sure what I did wrong.
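No answer is shown, but two things stand out: -12731 is kCMSampleBufferError_RequiredParameterMissing, and for audio the duration field of CMSampleTimingInfo is the duration of one sample, not of the whole buffer. A hedged sketch that derives the sample count and timing from the tap's actual buffer; whether this alone clears -12731 is an assumption:

import AVFoundation
import CoreMedia

// Builds a CMSampleBuffer from an AVAudioPCMBuffer delivered by a tap.
// Sample count and per-sample duration come from the buffer itself.
func makeSampleBuffer(from buffer: AVAudioPCMBuffer) -> CMSampleBuffer? {
    var timing = CMSampleTimingInfo(
        duration: CMTime(value: 1, timescale: CMTimeScale(buffer.format.sampleRate)),
        presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
        decodeTimeStamp: .invalid)
    var sbuf: CMSampleBuffer?
    guard CMSampleBufferCreate(allocator: kCFAllocatorDefault,
                               dataBuffer: nil,
                               dataReady: false,
                               makeDataReadyCallback: nil,
                               refcon: nil,
                               formatDescription: buffer.format.formatDescription,
                               sampleCount: CMItemCount(buffer.frameLength),
                               sampleTimingEntryCount: 1,
                               sampleTimingArray: &timing,
                               sampleSizeEntryCount: 0,
                               sampleSizeArray: nil,
                               sampleBufferOut: &sbuf) == noErr,
          let sampleBuffer = sbuf else { return nil }
    guard CMSampleBufferSetDataBufferFromAudioBufferList(
            sampleBuffer,
            blockBufferAllocator: kCFAllocatorDefault,
            blockBufferMemoryAllocator: kCFAllocatorDefault,
            flags: 0,
            bufferList: buffer.audioBufferList) == noErr else { return nil }
    return sampleBuffer
}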
-
AVAudioEngine with AVAudioPlayerNode does not play any sound
I'm working with a 3rd-party C++ library for audio processing, but for starters, I can't even get a simple audio engine to play. This is my AudioEngine.mm file; I call its startAVEngine method from my main ViewController (written in Swift) via AudioEngine().startAVEngine():
#import <AudioToolbox/AudioToolbox.h>
#import <AVFoundation/AVFoundation.h>
#import <UIKit/UIKit.h>
#import "AudioEngine.h"

@interface AudioEngine () {}

@property (nonatomic, strong) AVAudioPlayerNode *audioPlayerNode;
@property (nonatomic, strong) AVAudioFile *audioFile;
@property (nonatomic, strong) AVAudioPCMBuffer *audioPCMBuffer;
@property (nonatomic) AVAudioEngine *avEngine;

@end

@implementation AudioEngine

- (AVAudioEngine *)avEngine {
    if (!_avEngine) {
        _avEngine = [[AVAudioEngine alloc] init];
    }
    return _avEngine;
}

- (void)startAVEngine {
    NSError *fileerror;
    NSString *soundFilePath = [[NSBundle mainBundle] pathForResource:@"SampleMonoForDemo" ofType:@"wav"];
    NSURL *soundFileURL = [NSURL fileURLWithPath:soundFilePath];
    self.audioFile = [[AVAudioFile alloc] initForReading:soundFileURL error:&fileerror];
    NSLog(@"error:%@", fileerror);

    AVAudioFormat *audioFormat = self.audioFile.processingFormat;
    AVAudioFrameCount length = (AVAudioFrameCount)self.audioFile.length;
    self.audioPCMBuffer = [[AVAudioPCMBuffer alloc] initWithPCMFormat:audioFormat frameCapacity:length];
    [self.audioFile readIntoBuffer:self.audioPCMBuffer error:nil];

    self.audioPlayerNode = [[AVAudioPlayerNode alloc] init];
    [[self avEngine] attachNode:self.audioPlayerNode];

    // Connect Nodes
    AVAudioMixerNode *mixerNode = [[self avEngine] mainMixerNode];
    AVAudioOutputNode *outputNode = [[self avEngine] outputNode];
    [[self avEngine] connect:self.audioPlayerNode to:mixerNode format:self.audioFile.processingFormat];
    [[self avEngine] connect:mixerNode to:outputNode format:self.audioFile.processingFormat];

    // Start the engine.
    NSError *error;
    [[self avEngine] prepare];
    [[self avEngine] startAndReturnError:&error];
    if (error) {
        NSLog(@"error:%@", error);
    }
    NSLog(@"Engine description: %@", [[self avEngine] description]);
    if ([[self avEngine] isRunning]) {
        printf("Engine running\n");
    }

    [self.audioPlayerNode scheduleBuffer:self.audioPCMBuffer atTime:nil options:AVAudioPlayerNodeBufferLoops completionHandler:nil];
    [self.audioPlayerNode play];
    if ([self.audioPlayerNode isPlaying]) {
        printf("Playing file\n");
    }
}

@end
And this is my AudioEngine.h file:
#import <AVFoundation/AVFoundation.h>

NS_ASSUME_NONNULL_BEGIN

@interface AudioEngine : NSObject

- (void)startAVEngine;

@end

NS_ASSUME_NONNULL_END
This is my output. It seems to work, since I'm seeing the printed messages and the graph description, although no sound is produced through either the speakers or headphones. I've ensured that the audio file is actually part of the bundle resources, and I have no problems playing the sound with SCNAudioPlayer, so I figure the problem must reside within the AVAudioEngine setup. I've tried working both with and without the audioPCMBuffer.
I've looked in every thread I could find on the issue, and I suspect that it could be a global/local (scoping) issue, since this is the first time I'm coding in Objective-C. Could somebody confirm whether this is the case and provide some guidance on how to fix it? Or is it another issue?
2021-04-08 12:12:56.005446+0200 Audio Rehab[876:114526] Metal GPU Frame Capture Enabled
2021-04-08 12:12:56.005564+0200 Audio Rehab[876:114526] Metal API Validation Enabled
2021-04-08 12:12:56.236392+0200 Audio Rehab[876:114526] error:(null)
2021-04-08 12:12:56.480065+0200 Audio Rehab[876:114526] Engine description:
________ GraphDescription ________
AVAudioEngineGraph 0x15892c5f0: initialized = 1, running = 1, number of nodes = 3

  ******** output chain ********

  node 0x280e7b100 {'auou' 'rioc' 'appl'}, 'I'
    inputs = 1
      (bus0, en1) <- (bus0) 0x280e65800, {'aumx' 'mcmx' 'appl'}, [ 1 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
  node 0x280e65800 {'aumx' 'mcmx' 'appl'}, 'I'
    inputs = 1
      (bus0, en1) <- (bus0) 0x281c76800, {'augn' 'sspl' 'appl'}, [ 1 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
    outputs = 1
      (bus0, en1) -> (bus0) 0x280e7b100, {'auou' 'rioc' 'appl'}, [ 1 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
  node 0x281c76800 {'augn' 'sspl' 'appl'}, 'I'
    outputs = 1
      (bus0, en1) -> (bus0) 0x280e65800, {'aumx' 'mcmx' 'appl'}, [ 1 ch, 44100 Hz, 'lpcm' (0x00000029) 32-bit little-endian float, deinterleaved]
______________________________________
Engine running
Playing file
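No answer is shown, but one frequent cause of exactly this symptom on iOS (the engine reports running, the node reports playing, yet nothing is audible) is an unconfigured audio session. A hedged sketch of the session setup to try before starting the engine (Swift shown; the Objective-C equivalent uses [AVAudioSession sharedInstance] with setCategory:error: and setActive:error:):

import AVFoundation

// Configure the app's audio session for playback before starting the engine.
// Without this, AVAudioEngine can start "successfully" yet stay silent.
func activatePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback)
    try session.setActive(true)
}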
-
Why does the second view controller display incorrectly in a Playgrounds environment?
I'm encountering an issue with Swift Playgrounds that prohibits me from making further progress on a project. I notice that when SecondViewController is presented, it appears larger than before; the constraints it had (which were similar to the previous view controller's) did not produce a layout similar to IntroductionViewController, the first view controller.
This project is created in the Playgrounds app on macOS, not Xcode.
The issue
The two images below show the two view controllers, IntroductionViewController and SecondViewController, respectively. Notice how the UIStackView (with a blue background) on the two screens does not share the same horizontal constraints.
The code
Here's the code for IntroductionViewController. To cut down, only the essential bits of code which I suspect may have contributed to the issue are shown:

import UIKit

private protocol IntroductionDelegate {
    func beginButtonTapped()
}

private class IntroductionView: UIView {
    lazy var dotView: UIView = {
        let dotView = UIView()
        dotView.backgroundColor = .label
        dotView.translatesAutoresizingMaskIntoConstraints = false
        return dotView
    }()

    // [Redacted code]: titleLabel, subtitleLabel, beginButton

    lazy var stackView: UIStackView = {
        let stackView = UIStackView()
        stackView.alignment = .center
        stackView.distribution = .fillProportionally
        stackView.addSubview(dotView)
        stackView.addSubview(titleLabel)
        stackView.addSubview(subtitleLabel)
        stackView.addSubview(beginButton)
        stackView.backgroundColor = .blue
        stackView.translatesAutoresizingMaskIntoConstraints = false
        return stackView
    }()

    var delegate: IntroductionDelegate?

    // [Redacted code]: Required inits

    private func configureLayout() {
        NSLayoutConstraint.activate([
            stackView.rightAnchor.constraint(equalTo: rightAnchor, constant: -100.0),
            stackView.leftAnchor.constraint(equalTo: leftAnchor, constant: 100.0),
            stackView.centerXAnchor.constraint(equalTo: centerXAnchor),
            stackView.centerYAnchor.constraint(equalTo: centerYAnchor),
            dotView.topAnchor.constraint(equalTo: stackView.topAnchor),
            dotView.centerXAnchor.constraint(equalTo: stackView.centerXAnchor),
            dotView.heightAnchor.constraint(equalTo: heightAnchor, multiplier: 0.1),
            dotView.widthAnchor.constraint(equalTo: dotView.heightAnchor),
            // [Redacted code]
            beginButton.topAnchor.constraint(equalTo: subtitleLabel.bottomAnchor, constant: 40.0),
            beginButton.rightAnchor.constraint(equalTo: stackView.rightAnchor),
            beginButton.bottomAnchor.constraint(equalTo: stackView.bottomAnchor),
            beginButton.leftAnchor.constraint(equalTo: stackView.leftAnchor),
            beginButton.heightAnchor.constraint(equalToConstant: 65.0)
        ])
    }

    @objc func begin() {
        beginButton.isEnabled = false
        UIView.animate(withDuration: 1.5, delay: 0.0, options: .curveEaseInOut, animations: {
            // [Redacted code]: Animations
        }, completion: { (nil) in
            NSLayoutConstraint.activate([
                self.dotView.bottomAnchor.constraint(equalTo: self.stackView.bottomAnchor)
            ])
            UIView.animate(withDuration: 0.75, delay: 0.5, options: .curveEaseInOut, animations: {
                // [Redacted code]: Animations
            }, completion: { (nil) in
                self.delegate?.beginButtonTapped()
            })
        })
    }
}

public class IntroductionViewController: UIViewController, IntroductionDelegate {
    func beginButtonTapped() {
        let secondVC = SecondViewController()
        secondVC.modalPresentationStyle = .fullScreen
        print(view.frame.size)
        secondVC.screenSize = view.frame.size
        self.present(secondVC, animated: false, completion: nil)
    }

    public override func loadView() {
        let view = IntroductionView()
        view.delegate = self
        self.view = view
    }
}
Here is the code for SecondViewController as well:

import UIKit

class SecondView: UIView {
    lazy var dotView: UIView = {
        let dotView = UIView()
        dotView.backgroundColor = .label
        dotView.translatesAutoresizingMaskIntoConstraints = false
        return dotView
    }()

    lazy var stackView: UIStackView = {
        let stackView = UIStackView()
        stackView.alignment = .center
        stackView.distribution = .fillProportionally
        stackView.addSubview(dotView)
        stackView.backgroundColor = .blue
        stackView.translatesAutoresizingMaskIntoConstraints = false
        return stackView
    }()

    // [Redacted code]: Required inits

    private func configureLayout() {
        NSLayoutConstraint.activate([
            stackView.rightAnchor.constraint(equalTo: rightAnchor, constant: -100.0),
            stackView.leftAnchor.constraint(equalTo: leftAnchor, constant: 100.0),
            stackView.centerXAnchor.constraint(equalTo: centerXAnchor),
            stackView.centerYAnchor.constraint(equalTo: centerYAnchor),
            dotView.topAnchor.constraint(equalTo: stackView.topAnchor),
            dotView.bottomAnchor.constraint(equalTo: stackView.bottomAnchor),
            dotView.centerXAnchor.constraint(equalTo: stackView.centerXAnchor),
            dotView.heightAnchor.constraint(equalTo: heightAnchor, multiplier: 0.1),
            dotView.widthAnchor.constraint(equalTo: dotView.heightAnchor),
        ])
    }
}

public class SecondViewController: UIViewController {
    public var screenSize: CGSize?

    public override func loadView() {
        let view = SecondView()
        view.frame.size = screenSize!
        print(view.frame.size)
        self.view = view
    }
}
The attempts
I included two print() statements to get the size of the view's frame, just to see if the view was resized correctly. Sure enough, I got the following in the console:

(778.5, 1055.5)
(778.5, 1055.5)
Moving the print() statements to the layoutSubviews() function on each VC (redacted in the examples, but it only contains code setting the corner radius of dotView and beginButton) unveiled new values that weren't shown previously:

(701.0, 805.0)
(701.0, 805.0)
(778.5, 1055.5)
(778.5, 1055.5)
(778.5, 1055.5)
(778.5714285714286, 1055.844155844156)
(778.5714285714286, 1055.844155844156)
(779.0, 1056.0)
(599.5, 380.5) [console opened?]
I assume that layoutSubviews() is dynamic and changes actively as the Live View is being reshaped, though I cannot determine why the Live View would show such striking differences when there is only an extremely minute difference in the frame size that was set.
That's all that I know to do, though. What has gone wrong here, and how can it be resolved?
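No answer is shown here, but one hedged guess: hard-coding the first controller's frame size in loadView can fight with the size UIKit gives a presented full-screen controller, and in a playground the live view is resized frequently. A sketch of SecondViewController (SecondView is the question's own class) that lets the presentation size its root view instead; this is an assumption, not a confirmed fix:

import UIKit

public class SecondViewController: UIViewController {
    public override func loadView() {
        let view = SecondView()
        // No hard-coded frame: let UIKit size the presented controller's
        // root view, and track future live-view resizes automatically.
        view.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        self.view = view
    }
}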
-
@State is not working on iPad Swift Playgrounds
I'm trying to use SwiftUI on my iPad with Swift Playgrounds. The view below renders fine initially, but unfortunately it does not update when @State changes, like it does on my Mac. In the little sidebar I can see the closure does get executed, though. I'm using the newest non-beta version of everything.
import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    @State private var tapCount = 0

    var body: some View {
        Button("Tap count: \(tapCount)") {
            tapCount += 1
        }
    }
}

PlaygroundPage.current.setLiveView(ContentView())
Thanks.
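No answer is shown, but a workaround commonly suggested for this iPad Playgrounds behavior is to assign a UIHostingController to liveView instead of passing the SwiftUI view directly to setLiveView; treat it as a hedged suggestion rather than a confirmed fix:

import SwiftUI
import PlaygroundSupport

struct ContentView: View {
    @State private var tapCount = 0

    var body: some View {
        Button("Tap count: \(tapCount)") {
            tapCount += 1
        }
    }
}

// Wrapping in UIHostingController instead of using setLiveView(_:).
PlaygroundPage.current.liveView = UIHostingController(rootView: ContentView())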