It's also written by Arsian @QuadDamaged.

If you're looking for something with more options, the MyNoise app is pretty nice. It has a bunch more sound choices than just simple white noise, but the white noise generator itself is free and works well.
I have been running their rain and cat purring sound generators all day – came in handy in the 33℃ heat here ;-)
Woah, TIL, that's awesome!
I occasionally have an issue where I'll save something to my iCloud Drive on my phone or iPad, but it doesn't show up on my Mac because it has decided to stop syncing iCloud Drive, and the only way I could figure out how to restart it was a reboot, which is annoying. Today I discovered that "killall bird" in a terminal unclogs whatever wasn't working, and my iCloud Drive instantly synced!

That's a bit... nuclear, nuking the cloud daemon like that (even if it works for you). You may want to try Howard's suggestions or tools first.
Just "killall bird" by itself (without a -9) is going to send a SIGTERM; that's a perfectly fine way to cleanly kill a process (and is how launchd is going to terminate it if you reboot or stop the service with launchctl).

Don't send SIGQUIT; that will usually trigger a core dump and make the crash reporter dialog pop up, and whatever you're terminating might not clean up after itself (or may not clean up as completely).

Ah, sorry. I mixed them up, my bad. Should teach me not to post before the morning coffee...
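A quick generic demo of why SIGTERM is the polite default (nothing bird-specific here, just a throwaway child process): a process can trap SIGTERM and clean up before exiting, which SIGKILL never allows.

```shell
#!/bin/sh
# A child process traps SIGTERM (the signal plain `kill`/`killall`
# send by default) and gets a chance to clean up before exiting.
sh -c 'trap "echo cleaned up; exit 0" TERM; while :; do sleep 1; done' &
child=$!
sleep 1

kill -TERM "$child"   # same default signal as plain `killall bird`
wait "$child"         # child prints "cleaned up" and exits with status 0

# kill -KILL (-9) would end the child immediately: the trap never
# runs, so no cleanup happens. Use it only as a last resort.
```

Sending `-9` to a daemon like bird skips any flushing or teardown it would normally do, which is why the SIGTERM default is the safer habit.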
Very long story short, my Citi AA credit card likes to cry fraud when I use Apple Pay for certain annoying transactions.
You have to call them so they can text you a code to verify it's you, but not to the phone number on file: you have to give them some other number.
I realized this morning my Apple Watch Ultra has a unique phone number. And yes, you can text it directly!
Not sure if it's just a unique number, but regarding uniqueness and credit cards: every card you add to Apple Wallet has a unique number for each of the physical card, your iPhone, and your Apple Watch. So if you have to initiate a return and they require you to present the payment method, you might have to try all three if you don't remember which one you used! </derail>
I'm assuming that this is only for cellular watches that have been provisioned, right? I have a cellular Series 6 (A2294), but I never provisioned it, and I don't see a phone number anywhere in the Settings.
BONUS: if you are moving icons around between folders and the home screen, you can swipe up to turn off jiggle mode instead of tapping the Done button in the upper right. Seems trivial, but it's SO much better, particularly if you're moving a bunch of stuff around.

This is how it's always been, and omg yes, it's better than hunting for the Done button.
Before this, for me it was the home button, and then when the home button went away I thought you had to use the Done button ;_;
Darn, you're half-right: it's not exclusive to the iPhone 15, but it requires an iPhone 13 or later. I got excited thinking that it would work on my iPhone 12 Pro, but no dice.
Edit, more bonus (also, are iOS tricks welcome before I post others?): Apple showed off retroactively changing the focal point using the new camera on the iPhone 15, but this works all the way back to the iPhone 12 (I think) in iOS 17. You can pop open an old portrait photo and tap different parts of the scene to change the focus - not just the focal length, but what the image subject is. It's pretty snazzy!
When capturing text from several sources to paste into one document, I found it annoying to have different sizes/styles of text showing up and later having to go and make everything my preferred font and size. HOWEVER, there is a menu item, "Edit" > "Paste and Match Style", which makes the pasted text look like I just typed it! The keystroke is SHIFT-CONTROL-CMD-V.

Works just about everywhere except MS Word, which often (but not always) insists that it knows better than you what you want to do, and often (but not always) preserves some (but not all) of the original formatting. The workaround is Edit > Paste Special > Unformatted Text (not available from Right-Click > Paste Special), as this has a greater likelihood (but no guarantee) of stripping all original formatting.
Not actually hidden, but often overlooked.
The ultimate workaround is to paste into a plain-text-only app like Text Edit and then copy/paste into the actual destination.
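On macOS, the TextEdit round-trip can also be done in one shot from Terminal (assuming you're comfortable in a shell): `pbpaste` writes out the plain-text representation of the clipboard, and piping it back into `pbcopy` replaces the rich clipboard contents with exactly that plain text. Note it drops all formatting, not just the unwanted bits.

```shell
# Flatten the current clipboard to plain text, in place:
# pbpaste emits the clipboard's plain-text representation,
# and pbcopy replaces the clipboard with what it reads on stdin.
pbpaste | pbcopy
```

After running this, a plain Cmd-V paste into Word (or anywhere else) carries no formatting for the destination app to "helpfully" preserve.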
These extra steps can be circumvented as follows. Paste your text into Word, then click the little clipboard icon that appears just after the pasted content:
View attachment 65303
Choose "Keep Text Only" for great victory.
Interesting. I'll have to try this and see if it works as expected. "Do X. No, really, do X. Yes, I'm sure."
It works as advertised. I use it all day, every day.
I'm not 100% sure this is the right thread, but it is both cool and very stupid: Flappy Bird implemented in Finder.

Can't wait for a 0-day of that.
So did anyone else all these years never realize you could select and then drag/drop multiple tabs in a browser just by using Command and Shift like normal, but for tabs?? Just me?

All those years?!? Wasn't that a feature only added this fall with Safari 17?
I just checked: Firefox and Chromium, at least, look to have had it a while ago. It works for me on an old system in those, and for all I know browsers like OmniWeb might have supported it 15 years ago. I've used every single one of those browsers, and long-gone stuff like Internet Explorer for Mac back when that was the best around. What was it, 10.1? 10.2? Gods, when did Apple even launch Safari anyway? /me looks it up... it was in 2003, with Panther, and it feels like the whole KHTML drama and so on was much later in the 2000s. Anyway, in all that time I somehow never thought to check that, or if I did, I only tried it once, and if it didn't work I never thought of it again.
Yeah, I was shocked to see that Safari JUST added this, but I'm a Firefox evangelist and everything else is either banned from my system or a backup, so I guess I can't be surprised that I missed when the feature hit Safari. I'm glad WebKit exists, though. Fucking Chromium... sheeeeeeit. sigh
WHAT IS THIS SORCERY.

I was at the grocery store listening to music on my 13 Pro via my APP2s. My wife and I were having a conversation via text, with me dictating my replies to Siri and Siri speaking my wife's replies back to me. No looking at the screen. No big deal; I do this all the time.

But then my wife sent me a photo of something she was looking at in a store... and Siri described the photo to me: "Kathrine sent you a message with a photo of a gray sectional couch with a table beside it."

I know that iOS has been able to identify people, then objects, with more and more granularity (like how you can use Photos.app to identify flowers, etc.), but connecting that ability to Siri so it can proactively describe objects in a photo it has determined you cannot see at the moment is clever and useful. And maybe that connection has been there a while and I just never experienced it, but... still cool.

News to me!
Seems like it's new in iOS 17.2. I often have an AirPod in listening to a podcast, and I'm getting the picture description now as well.