WWDC, June 10-14, 2024, to be 'A(bsolutely) I(ncredible)' — Keynote Discussion Here!

stevenkan

Ars Legatus Legionis
15,662
Headline condensed from:


But there's no doubt that Joswiak's choice of words is intentional.
 

Aleamapper

Ars Scholae Palatinae
1,284
Subscriptor
I’m actually pretty excited about this from a developer standpoint. The transition to Swift 6 is going to be a Big Deal™️ for a lot of folks and the improvements to the ownership model of Swift stand to create some substantial performance improvements.
I haven't been keeping up with Swift Evolution at all since last year. I know there's some Rust-like ownership stuff being added, but is there much else that's exciting? Actors and async/await from last year seemed like bigger deals.
 

ProMacUser

Ars Scholae Palatinae
979
I haven't been keeping up with Swift Evolution at all since last year. I know there's some Rust-like ownership stuff being added, but is there much else that's exciting? Actors and async/await from last year seemed like bigger deals.
Actors are going to be more strictly regulated. Basically, you're not going to be able to cross an Actor boundary anymore unless the object is guaranteed to be immutable or otherwise thread safe (Sendable). And it's a compile-time check. The concept isn't new, but it is going to be a lot more strict, and I think that's pretty notable.
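The boundary check described above can be sketched in a few lines (a hypothetical example, not Apple code): a value crossing into an actor must be Sendable, which the compiler can prove for a struct whose stored properties are all immutable and themselves Sendable. Runnable as a main.swift/script file on a recent toolchain:

```swift
// A value type whose stored properties are all immutable and Sendable,
// so the compiler can prove it is safe to send across actor boundaries.
struct Measurement: Sendable {
    let value: Double
    let label: String
}

// An actor serializes access to its mutable state; under strict
// checking, anything passed in or out must be Sendable.
actor Recorder {
    private var readings: [Measurement] = []

    func record(_ m: Measurement) {
        readings.append(m)
    }

    var count: Int { readings.count }
}

// Top-level await (works in a main.swift / script context).
let recorder = Recorder()
await recorder.record(Measurement(value: 21.5, label: "temp"))
await recorder.record(Measurement(value: 22.0, label: "temp"))
let n = await recorder.count
print(n)  // 2
```

Under the Swift 6 language mode, passing a non-Sendable class into `record(_:)` becomes a compile-time error rather than the warning it is today.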
 
  • Like
Reactions: Aleamapper

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
There’s some custom executor functionality and various small holes in the data race safety model have been filled. The Variadics model has been usefully extended to loops (I think this will be a big one for the use of Variadics in SwiftUI). And the ownership model has the potential to substantially speed up some algorithms where copying is the bottleneck. This stands to improve the Swift ecosystem even if you don’t directly interact with it as packages start to use it for their internal implementations.
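The variadics-in-loops change mentioned above is pack iteration (SE-0408). A minimal sketch, assuming a Swift 6.0 toolchain (the function name is made up):

```swift
// Pack iteration: loop directly over the elements of a value
// parameter pack, each element keeping its own static type.
func describeAll<each T>(_ values: repeat each T) -> [String] {
    var descriptions: [String] = []
    for value in repeat each values {
        descriptions.append(String(describing: value))
    }
    return descriptions
}

// Heterogeneous arguments, no overloads or type erasure needed.
let described = describeAll(1, "two", 3.0)
print(described)
```

Previously this kind of element-wise processing required awkward repeat-pattern expressions or fixed-arity overloads, which is why it matters for APIs like SwiftUI's ViewBuilder-style containers.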

Mostly I think the strict data safety checks are what will get package developers off the fence and doing more with async. So while there isn’t a ton of new functionality there my guess is that we’ll see an explosion of async use. I think this will be particularly true of Apple APIs. We’re going to see a lot of Apple APIs making use of both async and the ownership model.
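The ownership improvements build on noncopyable types (SE-0390) and the `consuming`/`borrowing` modifiers. A minimal, hypothetical sketch of how a consuming method hands off data without a defensive copy:

```swift
// A noncopyable type: the compiler enforces a single owner, so the
// value moves instead of being copied (no hidden copy/retain traffic).
struct Buffer: ~Copyable {
    var bytes: [UInt8]

    // 'consuming' takes ownership and ends the value's lifetime here,
    // so the contents can be handed off without copying.
    consuming func takeBytes() -> [UInt8] {
        return bytes
    }
}

let buffer = Buffer(bytes: [1, 2, 3])
let contents = buffer.takeBytes()
// Any further use of 'buffer' below this point is a compile-time error.
print(contents.count)  // 3
```

This is the mechanism that lets packages eliminate copying in hot paths internally, which is why it can speed things up even for clients that never write `~Copyable` themselves.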
 
  • Like
Reactions: Aleamapper

kenada

Ars Legatus Legionis
17,112
Subscriptor
It’s wishful thinking, but I’d like to see Apple be a better open source citizen. They run or participate in a few projects, but they mostly release source as source drops. There’s no way to contribute, and the releases often require (sometimes a lot of) extra work to build.

(This comment brought to you by my trying to update libiconv in nixpkgs and wondering why it causes libarchive’s tests to break.)
 

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
Unless Apple makes those changes back-deployable, it’ll be a while before developers can really take advantage of them. It’s one of the limitations of shipping Swift with the OS and not providing updated runtimes for older versions.
Ish. Much (most?) of Swift 6 is just moving Sendable checking from a warning to an error. The bulk of the async/await features have been phased in over the past couple of years, and the ecosystem is very familiar with them.

It’s the first time in a number of years where they’re implementing source-breaking changes so it will be interesting to see how that goes.
 
  • Like
Reactions: Nugget

kenada

Ars Legatus Legionis
17,112
Subscriptor
Ish. Much (most?) of Swift 6 is just moving Sendable checking from a warning to an error. The bulk of the async/await features have been phased in over the past couple of years, and the ecosystem is very familiar with them.
The syntactic changes should be back-deployable. It’s the runtime changes that are uncertain or unlikely. For example, if Swift 6 requires certain properties of the runtime for its strict concurrency model to work, then the Apple OS versions one can target may be limited.

It’s the first time in a number of years where they’re implementing source-breaking changes so it will be interesting to see how that goes.
As long as the modules you use can target different language versions, it should be transparent. I doubt Apple will make the mistake Python did with Python 3. The Swift 4 to 5 transition went pretty smoothly IIRC.
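Per-module language modes can be sketched in SwiftPM terms. A hypothetical Package.swift (target names made up, assuming the tools-6.0 `swiftLanguageMode` setting), where one target builds in the Swift 6 mode while another stays on Swift 5 during migration:

```swift
// swift-tools-version: 6.0
import PackageDescription

let package = Package(
    name: "Example",
    targets: [
        // Builds in the Swift 6 language mode (the tools-version
        // default), with strict concurrency violations as errors.
        .target(name: "ModernModule"),
        // Still builds in the Swift 5 language mode, so Sendable
        // violations remain warnings while this target migrates.
        .target(
            name: "LegacyModule",
            swiftSettings: [.swiftLanguageMode(.v5)]
        ),
    ]
)
```

Because the mode is per target rather than per package graph, a dependency can move to Swift 6 without forcing its clients to, which is what makes the transition incremental rather than Python-3-style.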
 

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
As long as the modules you use can target different language versions, it should be transparent. I doubt Apple will make the mistake Python did with Python 3. The Swift 4 to 5 transition went pretty smoothly IIRC.
I wasn’t thinking ‘smooth vs. problematic’ so much as ‘slow uptake vs. fast uptake’.
 
  • Like
Reactions: kenada

kenada

Ars Legatus Legionis
17,112
Subscriptor
I wasn’t thinking ‘smooth vs. problematic’ so much as ‘slow uptake vs. fast uptake’.
Probably several years, if not longer. If Swift 5 continues to work and be supported, companies might stick with what works and makes money for them. I expect early adopters to be open-source projects, small code bases, and companies that adopt new technologies right away. For your electric company’s app? Assuming they’re not using some cross-platform framework already, probably not anytime soon.
 

cateye

Ars Legatus Legionis
11,760
Moderator
Thought I'd amend the title of this thread and give it a bump back up to the top of the list now that we're ~24 hours away from the keynote, particularly since the hours before a keynote are when we tend to see a rush of last-minute rumors and leaks.

AI / Apple Intelligence, iOS/iPadOS 18, macOS 15 (funny how there have been basically no rumors at all about macOS)... visionOS 2.0, maybe? MacRumors has a nice soup-to-nuts summary up of everything we're likely to learn about.
 

Jeff3F

Ars Tribunus Angusticlavius
6,826
Subscriptor++
My biggest objection to Siri isn’t that it’s bad (it is), but that when it fails and I say something spicy, Siri either reprimands me or makes a bad joke. I would prefer a speech or verbal user interface to detect frustration and use that as a quick way to undo or redo things. Keywords like “shit” or ”oops” or ”damn”, or anguished noises, should prompt the system to recognize that it wasn’t correct… it should not chide/manipulate a surly user or ignore the situation with a joke.
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
A thought in advance of the keynote: I’ll be surprised if Apple’s AI compatibility story is as simple as rumored, requiring an M1 or A17 Pro or greater – specifically on the iPhone and iPad side of the house.

I think – and hope – we are more likely to see a complex, feature-by-feature compatibility matrix across older models. Global, system-wide features, like the rumored revamped Siri, seem far more likely to have limited availability. But I find it difficult to imagine why an iPhone 13 Pro or 14 Pro would need to be excluded from app-centered functionality such as, say, AI-driven enhancements or retouching in Photos.
 
  • Like
Reactions: macosandlinux

ant1pathy

Ars Tribunus Angusticlavius
6,461
Is there any chance in this world that Apple will announce a hardware product related to all of this mania? That would be truly innovative, beyond the mainstream concepts and buzzwords.

— isn’t the AirPod speaker approach supposed to be smart Siri in a box? That hasn’t gone very well.
What hardware product would you possibly envision? A HomePod with a screen?
 

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
But I find it difficult to imagine why an iPhone 13 Pro or 14 Pro would need to be excluded from app-centered functionality such as, say, AI-driven enhancements or retouching in Photos.
I think the limiting factor is likely onboard RAM. Big generative models use a lot of memory, even with every trick Apple is working on to mitigate those limitations. The stuff they offload to the cloud can run on anything, but the privacy-forward on-device features will likely hit the RAM hard.

I wonder if this will be the kick in the pants that pushes Apple to substantially increase the RAM in their devices. For the longest time, skimping on RAM both padded the bottom line and ensured that devices weren’t too energy-hungry (you do need to power RAM, after all). But RAM is getting ever more energy-efficient, and ML is creating a meaningful use case for lots and lots of RAM. Apple could potentially throw their cash hoard around a bit to hoover up way more RAM than they’re using now.
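The RAM pressure is easy to ballpark. A back-of-the-envelope sketch (every number here is an illustrative assumption, not an Apple specification):

```swift
import Foundation

// Weight memory alone for an on-device model, ignoring activations,
// KV cache, and everything else competing for memory.
func modelWeightGiB(parameters: Double, bitsPerWeight: Double) -> Double {
    let bytes = parameters * bitsPerWeight / 8.0
    return bytes / 1_073_741_824.0  // bytes per GiB
}

// A hypothetical 3B-parameter model quantized to 4 bits per weight:
let weightsOnly = modelWeightGiB(parameters: 3e9, bitsPerWeight: 4)
print(String(format: "%.2f GiB", weightsOnly))  // ≈ 1.40 GiB
```

Even a modest model claims a sizable slice of a 6GB or 8GB phone before the OS and apps get a byte, which is why base RAM is the plausible cutoff.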
 

Bonusround

Ars Scholae Palatinae
1,060
Subscriptor
I think the limiting factor is likely onboard RAM. Big generative models use a lot of memory, even with every trick Apple is working on to mitigate those limitations. The stuff they offload to the cloud can run on anything, but the privacy-forward on-device features will likely hit the RAM hard.
Yes. And for a global, always-available service like Siri I can imagine the potential for severe RAM pressure.

App-level features aren’t the same. An Apple-supplied framework, linked by a specific app, can be unloaded at a moment’s notice or evicted with the app itself.

Draw Things is a Stable Diffusion front-end. It runs on my 6GB iPad without complaint. It can even do lower-res imagery on a 4GB 12 mini.

As to the call for more base RAM: hear, hear. No MacBook with less than 12GB by 2025.
 

wrylachlan

Ars Legatus Legionis
12,769
Subscriptor
App-level features aren’t the same. An Apple-supplied framework, linked by a specific app, can be unloaded at a moment’s notice or evicted with the app itself.
If Apple wants AI to be a big part of their offering, they don’t want Apps (or even just the models they rely on) evicted from memory the second you multi-task away. If I’m cycling back and forth through 3-4 apps that all use ML I don’t want the hiccup of reloading the whole thing every time I switch.

So I imagine that the most important core models would be expected to stay memory resident through task switching.

Any way you slice it, RAM is going to be a limiting factor. We’re moving from a modality where a couple hundred lines of code add a new feature to one where you add hundreds of megabytes or even a few gigabytes of model size to add a new feature. That’s a fundamentally different proposition and will likely require a step change in RAM to really leverage the benefits.
 

cateye

Ars Legatus Legionis
11,760
Moderator
Part of the rumor about "Apple Intelligence" indicated there was going to be some kind of algorithm that determines whether a request is processed on-device or sent to the cloud. Perhaps memory pressure/availability is one of the metrics that goes into that decision-making? While I'm on board as much as anyone with Apple stopping being so impossibly stingy with RAM, that really seems to be a hill they want to die on for whatever few shekels of profit it guarantees. So I could see them trying to engineer around that limitation as much as possible.
 

Honeybog

Ars Scholae Palatinae
2,075
In non-AI news, MacRumors is reporting that watchOS will drop support for the Series 4, Series 5, and 2020 SE. That’s a huge bummer if so. Four (and five) years is a pretty short support window, and I really don’t like dropping multiple generations at the same time, since it makes it harder to plan future purchases.

Also kind of a bummer because I’d be happy to upgrade my 5 if Apple could give me literally any reason that wasn’t the now disabled(?) pulse ox.
 
  • Hug
Reactions: Jeff3F

Jeff3F

Ars Tribunus Angusticlavius
6,826
Subscriptor++
I agree that sucks, but I also don’t find it as awful for the watch, as long as the core functionality continues and they hopefully keep extracting new/novel insights from the devices’ sensors (I was surprised to see that my iOS Health app can show me info about my gait or stair-climbing speed, for example).

I am a bit surprised that Apple seems to have gone with losing the oximetry feature altogether in already-released watches. It’s probably not a huge deal for me practically, but I still don’t like having a feature go away, whether or not I ever used it (though I was OK letting ‘Force Touch’ go).

And, as we all learned from Siri on the watches taking 6 hours, my fear is that using Siri-fied AI on phones will take a long, long time. This will be very interesting to follow, and I’m wondering if it’s prudent to wait a year to see what and how they do, and whether the 2025 iPhone will have improved hardware (more RAM?) to cope with onboard generative AI, vs. this year’s models.

:)