Supporting Dark Mode: In-App Web Content
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

If you’re using a web view the way a browser does, to show arbitrary content from the web, then you probably don’t need to do anything special to accommodate Dark Mode. If Dark Mode takes off, maybe we’ll see some kind of CSS standard for presenting web sites in dark appearances, but for the time being users expect web pages to look … like web pages. On the other hand, if you’re using web views to support an otherwise native Mac user interface, you’ll want to adapt the default styling of your web content so it looks appropriate in Dark Mode.

For example, I use a web view in my standard about box window. Here’s how it looked in MarsEdit before I adapted any web content to Dark Mode: The two-tone look is kind of cool, but too much of an assault on the eyes for anybody who has really settled into Dark Mode. This content is not like a web page. It’s implemented in HTML to make features such as styling, layout, and links easier to manage, but as far as users are concerned it’s an innate part of this native About Box window.

I’ve seen a few approaches to adapting web content to Dark Mode, but most of them relied too heavily on modifying the actual HTML content being shown. My apps use web views in several places and, rather than jump through hoops in each place to finesse the content for Dark Mode, I thought it would be better to come up with a common infrastructure that all web content presenters could use without typically being concerned about appearance. Because my app uses both the legacy WebView and the modern WKWebView, I had to duplicate my efforts to some extent, but for the purposes of this article I’m going to focus on the approach I took for WKWebView.
My solution is rooted in arranging for a web view to call a pertinent JavaScript function when the effective appearance changes. Because WKWebView instances are notified via the viewDidChangeEffectiveAppearance method, I decided to subclass WKWebView to fulfill this contract on behalf of clients:

class RSAppearanceSensitiveWKWebView: WKWebView {

    var didInitialize = false

    // Override designated initializers to record when we're
    // done initializing, and avoid evaluating JS until we're done.
    override init(frame: CGRect, configuration: WKWebViewConfiguration) {
        super.init(frame: frame, configuration: configuration)
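The excerpt cuts off mid-snippet. To make the idea concrete, here is a minimal sketch of how such a subclass might be completed. Only the class name, the didInitialize property, and the use of viewDidChangeEffectiveAppearance come from the excerpt; the JavaScript function name updateAppearance is my own assumption about what the hosted page might define, not the author’s actual code:

```swift
import WebKit

class AppearanceSensitiveWKWebView: WKWebView {
    var didInitialize = false

    // Record when initialization completes, so we avoid evaluating
    // JavaScript before the view is fully set up.
    override init(frame: CGRect, configuration: WKWebViewConfiguration) {
        super.init(frame: frame, configuration: configuration)
        didInitialize = true
    }

    required init?(coder: NSCoder) {
        super.init(coder: coder)
        didInitialize = true
    }

    override func viewDidChangeEffectiveAppearance() {
        guard didInitialize else { return }
        let isDark = effectiveAppearance.bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
        // Assumes the loaded page defines a global updateAppearance(isDark)
        // function that restyles itself for the given mode.
        evaluateJavaScript("updateAppearance(\(isDark))", completionHandler: nil)
    }
}
```

With this in place, any client that loads HTML into the view only needs to supply an updateAppearance handler in its page; the native side never touches the HTML itself.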
2 days ago
Supporting Dark Mode: Adapting Images
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

When Apple announced Dark Mode at WWDC 2018, one of my first thoughts was: “Aha! So that’s why Apple has us all switching our icons to template (monochrome) style images.” These images do tend to adapt pretty well to Dark Mode because the system can simply invert them and obtain a fairly usable icon, but the crude inversion doesn’t work well in all cases. There are a lot of details to really getting Dark Mode icons perfect, and Apple talks a lot about this in the Introducing Dark Mode WWDC session. Given my limited time and resources, I took a pragmatic approach to get things looking “good enough” so I could ship a cohesive project, while I hope to continue working on refinements in the future.

As with colors, asset catalogs can be a great aid in managing image variations for different appearances. Use them if you can, but bear in mind the caveats mentioned in the Adapting Colors section of this series.

Most apps will probably require some refinement of toolbar icons, the row of images typically displayed at the top of some windows. In MarsEdit, I was in pretty good shape thanks to a recent overhaul for MarsEdit 4, in which Brad Ellis revised my toolbar icons. Many, but not all, of the images have a template-style aesthetic. Here’s MarsEdit’s main window in Light Mode: Let’s see what happens when we just switch to Dark Mode without any special care for the icons: That’s … not so good! Even my vaguely template-style icons are not being treated as templates, so they render with their literal gray colors and look pretty bad in Dark Mode. I realized I could probably do some quick Acorn work and get the template-style icons into shape, but what about the ones with splashes of color? The pencil? Should it still be yellow in Dark Mode? I opted for a pragmatic, stop-gap solution.
Without making any changes whatsoever to the graphics files, I worked some magic in code and came up with this: That’s … actually pretty good! So what’s the magic in code I alluded to? I created a custom subclass of NSButton that will optionally set template status on the button’s image only if we’re in Dark Mode. You can see that I’ve left some of the icons untouched, because I felt their colors fit well enough in both dark and light modes. Here’s my custom RSDarkModeAdaptingToolbarButton: class RSDarkModeAdaptingTo
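The excerpt ends just as the class begins. Based purely on the description above — an NSButton subclass that sets template status on its image only in Dark Mode — a sketch might look like this. Everything except the general approach and the RSDarkModeAdaptingToolbarButton name is my reconstruction, not the author’s actual implementation:

```swift
import AppKit

class DarkModeAdaptingToolbarButton: NSButton {
    // Opt-in flag: some buttons (e.g. the yellow pencil) keep their
    // literal colors in both modes. The property name is my assumption.
    var adaptsToDarkMode = true

    override func viewDidChangeEffectiveAppearance() {
        super.viewDidChangeEffectiveAppearance()
        guard adaptsToDarkMode else { return }
        let isDark = effectiveAppearance.bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
        // Treat the image as a template only in Dark Mode, so the system
        // tints it appropriately; keep its literal colors in Light Mode.
        image?.isTemplate = isDark
    }
}
```

The appeal of this approach is that it requires no changes to the image assets themselves: the same gray-toned icons serve both modes.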
2 days ago
Supporting Dark Mode: Appearance-Sensitive Preferences
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

One of the challenges I dealt with in MarsEdit was how to adapt my existing user-facing color preferences to Dark Mode: the default values and any previously saved user customizations would be pertinent only to Light Mode. I knew I wanted to save separate values for Dark Mode, but I worried about junking up my preferences panel with separate labels and controls for each mode.

After playing around with some more complicated ideas, I settled on simplicity itself: I would register and save separate values for each user-facing preference, depending on whether the app is in Dark Mode right now. When a user switches modes, the visible color preferences in this panel change to the corresponding values for that mode. I reasoned that users are most likely to use these settings to fine-tune the appearance of the app as it appears now, and that it would be intuitive enough for them to discover that the colors can be customized separately for each mode.

How did I achieve this? I decided to bottleneck access to these preferences so that, as far as any consumer of the preference is concerned, there is only one value. For example, any component of my app that needs to know the “text editing color” consults a single property, “bodyTextColor”, on my centralized preferences controller.
This is what the supporting methods, along with the property accessor, look like:

func color(forKey key: String, defaultColor: NSColor) -> NSColor {
    let defaults = UserDefaults.standard
    guard let color = defaults.rsColor(forKey: key) else {
        return defaultColor
    }
    return color
}

func setColor(color: NSColor, forKey key: String) {
    let defaults = UserDefaults.standard
    defaults.rsSetColor(color, forKey: key)
}

var bodyTextPreferenceKey: String {
    if NSApp.isDarkMode {
        return bodyTextColorForDarkModeKey
    } else {
        return bodyTextColorForLightModeKey
    }
}

@objc var bodyTextColor: NSColor {
    get {
        let key = self.bodyTextPreferenceKey
        return self.color(forKey: key, defaultColor: .textColor)
    }
    set {
        let key = self.bodyTextPreferenceKey
        self.setColor(color: newValue, forKey: key)
        self.sendNotification(.textColorPreferenceChanged)
    }
}

Because the getter and setter always consult internal properties to determine the current underlying defaults key, we always write to and read from the semantic value that the user expects. The only thing the user interface for my
2 days ago
Supporting Dark Mode: Adapting Colors
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

Given the dramatic visual differences between appearances, virtually every color in your app will need to be varied at drawing time to work well with the current appearance.

Use Semantic Colors

The good news is all of Apple’s built-in, semantically named colors are automatically adapted to draw with a suitable color for the current appearance. For example, if you call “NSColor.textColor.set()” in light mode, and draw a string, it will render dark text, whereas in dark mode it will render light text. Seek out areas in your app where you use hard-coded colors and determine whether a system-provided semantic color would work just as well as, or better than, the hard-coded value. Among the most common fixes in this area will be locating examples where NSColor.white is used to clear a background before drawing text, when NSColor.textBackgroundColor will do the job better, and automatically adapt itself to Dark Mode.

Vary the Color at Drawing Time

In scenarios where a truly custom color is required, you have a few options. If the color is hard-coded in a drawing method, you may be able to get away with simply querying NSAppearance.current from the point where the drawing occurs. For example, a custom NSView subclass that simply fills itself with a hard-coded color:

override func draw(_ dirtyRect: NSRect) {
    super.draw(dirtyRect)

    // Custom yellow for light mode, custom purple for dark...
    let lightColor = NSColor(red: 1, green: 1, blue: 0.8, alpha: 1)
    let darkColor = NSColor(red: 0.5, green: 0.3, blue: 0.6, alpha: 1)
    if NSAppearance.current.isDarkMode {
        darkColor.setFill()
    } else {
        lightColor.setFill()
    }
    dirtyRect.fill()
}

Use Asset Catalogs

Special cases like above are fine when you draw your own graphics, but what about views that the system frameworks draw for you?
For example, because NSBox supports drawing a custom background color, you’re unlikely to implement a custom view exactly like the one above. Its functionality is redundant with what NSBox provides “for free.” But how can you ensure that NSBox will always draw in the right color for the current appearance? It only supports one “fillColor” property. A naive approach would involve paying attention to changes in appearance, and re-setting the fill color on NSBox every time. This would get the job done, but is more complicated and error-prone t
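The excerpt cuts off, but the asset-catalog alternative it is building toward can be sketched briefly. A color defined in an asset catalog (with Any/Light and Dark variants) resolves dynamically at draw time, so it only needs to be assigned to the NSBox once. The color-set name “BoxFill” here is my invention for illustration:

```swift
import AppKit

let box = NSBox()
box.boxType = .custom

// NSColor(named:) loads a dynamic color from the asset catalog.
// Because it resolves to the Light or Dark variant each time it is
// drawn, there is no need to re-set fillColor when the appearance
// changes -- the box simply redraws with the right variant.
if let fill = NSColor(named: "BoxFill") {
    box.fillColor = fill
}
```

This sidesteps the error-prone approach of listening for appearance changes and manually re-assigning a static color.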
2 days ago
Supporting Dark Mode: Responding to Change
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

To support Dark Mode elegantly in your app, you need to not only initialize your user interface appropriately for the current appearance, but also be prepared to adapt on the fly if the user changes appearance after your interface is already visible on the screen. In situations where you use semantic colors or consult the current appearance at drawing time, there’s nothing else to be done. Since changing the appearance invalidates all views on the screen, your app will redraw correctly without any additional intervention. Other situations may demand more customized behavior. I’ll give a couple examples later in this series, but for now just take my word that you may run into such scenarios.

If you have a custom view that needs to perform some action when its appearance changes, the most typical way to handle this is by implementing the “viewDidChangeEffectiveAppearance” method on NSView. This is called immediately after your view’s effectiveAppearance changes, so you can examine the new appearance, and update whatever internal state you need to before subsequent drawing or event handling methods are called:

class CustomView: NSView {
    override func viewDidChangeEffectiveAppearance() {
        // Update appearance related state here...
    }
}

If you need to react to appearance changes in the context of an NSViewController, your best bet may be to use Key Value Observing (KVO) to observe changes to the “effectiveAppearance” property on the view controller’s view. This approach will set your view controller up to be notified around the same time the view itself would be. In some situations it’s more interesting to know whether, at a high level, the application’s appearance has shifted in a significant way. For this you can observe “effectiveAppearance” on the NSApplication.shared instance.
There is no standard NotificationCenter notification for this, but I found it useful enough in my code bases that I added my own. By centralizing KVO observation of NSApp.effectiveAppearance, I can translate all such changes into NotificationCenter-based broadcasts that any component of my app can observe. To implement something like this, first declare an NSNotification.Name extension in some high-level class such as your application delegate: extension NSNotification.Name { public static let appAppearanceChanged
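The extension in the excerpt is cut off. A complete version of the arrangement it describes — centralized KVO on NSApp.effectiveAppearance rebroadcast through NotificationCenter — might look roughly like this. Only the appAppearanceChanged name comes from the excerpt; the rest is my sketch of one reasonable implementation, not the author’s code:

```swift
import AppKit

extension NSNotification.Name {
    public static let appAppearanceChanged = NSNotification.Name("appAppearanceChanged")
}

class AppDelegate: NSObject, NSApplicationDelegate {
    private var appearanceObservation: NSKeyValueObservation?

    func applicationDidFinishLaunching(_ notification: Notification) {
        // Centralize KVO here, and translate every app-level appearance
        // change into a notification any component can observe.
        appearanceObservation = NSApp.observe(\.effectiveAppearance) { app, _ in
            NotificationCenter.default.post(name: .appAppearanceChanged, object: app)
        }
    }
}

// Any interested component can then simply register:
// NotificationCenter.default.addObserver(
//     forName: .appAppearanceChanged, object: nil, queue: .main) { _ in
//     // Refresh appearance-dependent state...
// }
```

The advantage is that only one object in the app deals with KVO mechanics; everything else uses the familiar notification idiom.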
2 days ago
Supporting Dark Mode: Checking Appearances
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

As you adapt your app to support Dark Mode, you may run into situations where you need to determine, in code, what the active appearance is. This is most typical in custom views that vary their drawing based on the current appearance, but you may also need to test higher level appearance traits to determine, for example, whether the whole application is being run “in Dark Mode” or not.

Current and Effective Appearances

To work effectively with Dark Mode support, you must appreciate the distinction between a few key properties that are all of type NSAppearance. Take some time to read the documentation and watch the videos I referenced in Educational Resources, so you really understand them. Here is a capsule summary to use as reference in the context of these articles:

appearance is a property of objects, such as NSApplication, NSWindow, and NSView, that implement the NSAppearanceCustomization protocol. Any such object can have an explicit appearance deliberately set on it, affecting the appearance of both itself and any objects that inherit appearance from it. As a rule, views inherit the appearance of their window, and windows inherit the appearance of the app.

effectiveAppearance is a property of those same objects, taking into account the inheritance hierarchy and returning a suitable appearance in the likely event that no explicit value has been set on the object.

NSAppearance.current, or +[NSAppearance currentAppearance], is a class property of NSAppearance that describes the appearance that is currently in effect for the running thread. Practically speaking, you can think of this property as an ephemeral drawing variable akin to the current fill color or stroke color. Its value impacts the manner in which drawing that is happening right now should be handled.
Don’t confuse it with high-level user-facing options about which mode is set for the application as a whole.

High-Level Appearance Traits

As you modify your code to respect the current or effective appearance, you will probably need to make high-level assessments like “is this appearance light or dark?” Because of the aforementioned complication that there are many types of NSAppearance, that they can be nested, etc., it’s not possible to simply compare the current appearance with a named appearance. Instead, you use a method on N
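The excerpt ends just as it names the method. The standard tool here is NSAppearance’s bestMatch(from:), which resolves nested and vibrant appearances down to the closest of the named appearances you supply. It is commonly wrapped in a convenience like the following; the isDarkMode property name is my assumption, echoing the helper used elsewhere in this series:

```swift
import AppKit

extension NSAppearance {
    // Ask the appearance which of the given named appearances it most
    // closely matches. This handles vibrant and other derived
    // appearances that a direct name comparison would miss.
    var isDarkMode: Bool {
        return bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
    }
}

// Usage, e.g. inside draw(_:):
// if NSAppearance.current.isDarkMode { darkColor.setFill() }
```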
2 days ago
Supporting Dark Mode: Opting In
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

To run any app in Dark Mode, you must be running macOS Mojave 10.14 or greater. By default, all existing apps will run in Light Mode even if the system is configured to run in Dark Mode. An app that is launched on macOS Mojave will run in Dark Mode when two criteria are met:

The system considers the app to be compatible with Dark Mode
The running application’s appearance is set to Dark Aqua

An app’s compatibility with Dark Mode is determined by a combination of the SDK it was built against, and the value of the “NSRequiresAquaSystemAppearance” Info.plist key. If your app is built against the 10.14 SDK or later, it will be considered compatible unless the key is set to YES. If your app is built against the 10.13 SDK or earlier, it is considered incompatible unless the Info.plist key is set to NO. When a compatible app is launched, its appearance is set to match the user’s system-wide preference, as selected in the “General” tab of System Preferences.

To streamline development, Apple also provides a switch in Xcode itself so that the appearance of a running app can be switched on the fly without affecting the appearance of other apps running on your Mac. Although the Xcode switch is handy for making quick comparisons between modes, there is not, as far as I know, any mechanism to always launch an app from Xcode in Dark Mode when the system is in Light Mode, or vice-versa. If you strongly prefer one mode over the other, you may want to build affordances into your app that support debugging in “the other mode” when you need to.
For example, in the build settings for your app, find “Other Swift Flags,” and add “-DDEBUG_DARK_AQUA”: Then, somewhere early in your app’s launch, you can conditionally force the appearance if specified:

func applicationDidFinishLaunching(_ notification: Notification) {
    #if DEBUG_DARK_AQUA
        NSApp.appearance = NSAppearance(named: .darkAqua)
    #endif
}

This arrangement will allow you to run Xcode and other apps in Light Aqua while debugging your own app in Dark Mode. Check back tomorrow for the next article in the series!
2 days ago
What if we had oracles for common machine learning problems?
Rough working notes, musing out loud. Much effort in machine learning and AI research is focused on a few broad classes of problem. Three examples of such classes are: Classifiers, which do things like classify images according to their category, generalizing from their training data so they can classify previously unseen data in the wild; Generative models, which are exposed to data from some distribution (say, images of houses), and then build a new model which can generate images of houses not in the training data. In some very rough sense, such generative models are developing a theory of the underlying distribution, and then using that theory to generalize so they can produce new samples from the distribution; Reinforcement learning, where an agent uses actions to explore some environment, and tries to learn a control policy to maximize expected reward. These are old problem classes, going back to the 1970s or earlier, and each has seen tens of thousands of papers. Each of these problem classes is really beautiful: they’re hard, but not so hard it’s impossible to make progress; they’re precise enough that it’s possible to say clearly when progress is being made; they’re useful, and seem genuinely related to essential parts of the problem of AI. I occasionally wonder, though, what’s the end game for these problem classes? For instance, what will it mean if, in some future world, we’re able to solve the classifier problem perfectly? How much would that help us achieve the goal of general artificial intelligence? What else would it let us achieve? In other words, what happens if you skip over (say) the next few decades of progress in classifiers, or generative models, or reinforcement learning, and they become things you can just routinely do essentially perfectly, perhaps even as part of some standard library, much as (say) sorting routines or random number generation can be regarded as largely solved problems today?
What other problems then become either soluble, or at least tractable, which are intractable today? Perfect solutions don’t obviously help, even with closely adjacent problems: One obvious point is that you can make a great deal of progress on one of these problems and it doesn’t necessarily help you all that much even with problems which seem closely adjacent. For instance, suppose you can classify images perfectly. That doesn’t necessarily mean that you can solve the image segmentation problem – identifying the
10 days ago
The varieties of material existence
By Michael Nielsen Status: Rough and speculative working notes, very quickly written – basically, a little raw thinking and exploration. Knowledgeable corrections welcome! William James wrote a book with the marvellous title “The Varieties of Religious Experience”. I like the title because it emphasizes just how many and varied are the ways in which a human being can experience religion. And it invites followup questions, like how aliens would experience religion, whether other animals could have religious experiences, or what types of religious experience are possible in principle. As striking as are the varieties of religious experience, they pale beside the variety of material things that can possibly exist in the universe. Using electrons, protons, and neutrons, it is possible to build: a waterfall; a superconductor; a living cell; a Bose-Einstein condensate; a conscious mind; a black hole; a tree; an iPhone; a Jupiter Brain; a working economy; a von Neumann replicator; an artificial general intelligence; a Drexlerian universal constructor (maybe); and much, much else. Each of these is astounding. And they’re all built from arrangements of electrons, protons, and neutrons. As many people have observed, with good enough tweezers and a lot of patience you could reassemble me (or any other human) into a Bose-Einstein condensate, an iPhone, or a black hole. We usually think of all these things as separate phenomena, and we have separate bodies of knowledge for reasoning about each. Yet all are answers to the question “What can you build with electrons, protons, and neutrons?” For the past decade or so, when friends ask me what is the most exciting thing happening in science, one of the subjects I often burble about excitedly is quantum matter – very roughly, the emerging field in which we’re engineering entirely new states of matter, with intrinsically quantum mechanical properties.
It turns out there are far more types of matter, with far weirder properties, than people ever dreamed of. I’m not an expert on quantum matter; I only follow it from afar. Yet what I see makes me suspect something really profound and exciting is going on, something that may, in the decades and centuries to come, change our conception of what matter is. Furthermore, it seems to me that many other very interesting nascent ideas have a similar flavour: things like programmable matter, smart dust, utility fog, synthetic biology, and so on. In a detailed technical sense these
10 days ago
CoreObject is a version-controlled object database for Objective-C that supports powerful undo, semantic merging, and real-time collaborative editing.
27 days ago
Reactive functor
IObservable is (also) a functor.
4 weeks ago
Build a better Bookshelf
Step 1: Install any document scanning app on your phone
Step 2: Scan all the index pages and table of contents in all of your books
Step 3: Send it to whatever software you're using that has OCR feature
Step 4: Now you can search your books digitally
4 weeks ago
Tinderbox: The Tool For Notes
Tinderbox helps you visualize, analyze, and share your ideas.
5 weeks ago
Open Source
Free Form Data Organizer
(Hierarchical Spreadsheet)
5 weeks ago
Stylus Labs
Write is a word processor for handwriting.
5 weeks ago
The History of Aperture
For years, iLife defined the Mac experience, or at the very least, its marketing. An iMac or MacBook wasn't a mere computer; it was a tool for enjoying your music, managing your photos, creating your own songs, editing your home videos, and more. iLife was brilliant because it was approachable. Programs like iTunes, iPhoto, iMovie, iDVD, and GarageBand were so simple that anyone could just open them from the Dock and get started creating. Of course, not everyone's needs were met by the iLife applications. iMovie users could upgrade to Final Cut, while Logic was there waiting for GarageBand users. And for those needing more than what iPhoto could provide, Apple offered Aperture. Aperture 1.0 was released in fall 2005. The pitch from Apple was pretty straightforward: “Aperture is to professional photography what Final Cut Pro is to filmmaking,” said Rob Schoeben, Apple’s vice president of Applications Marketing. “Finally, an innovative post production tool that revolutionizes the pro photo workflow from compare and select to retouching to output.” “Until now, RAW files have taken so long to work with,” said Heinz Kluetmeier, renowned sports photographer whose credits include over 100 Sports Illustrated covers. “What amazed me about Aperture is that you can work directly with RAW files, you can loupe and stack them and it’s almost instantaneous—I suspect that I’m going to stop shooting JPEGs. Aperture just blew me away.” Some may have written off this new program as a Photoshop competitor, but Aperture was really designed to compete with something like Adobe Bridge or, later, Adobe Lightroom. Ken Rockwell opened his review with this: Aperture is software to help professional photographers select the very best out of hundreds or thousands of similar images. It also helps find relevant images out of the tens (or hundreds) of thousands of images in our archives. It allows fast and fluid sorting and browsing, even with RAW images.
As you know, RAW images are slower to open and browse, so before Aperture it was difficult to flip through hundreds, or even dozens, of RAW images instantly. Even here in 2018, RAW files can be difficult to deal with due to their size. In 2005, they were all but impossible to manage. Aperture set out to fix that, as Apple's website said: Featuring a RAW-focused workflow, Aperture makes RAW as easy as JPEG, letting you import, edit, catalog, organize, retouch, publish, and archive your images more effectively and effi
7 weeks ago
Timeline JS
Easy-to-make, beautiful timelines.
8 weeks ago
Don’t pave the paths used by the unhappy cows - Laughing Meme
“Pave the cowpaths, just don’t pave the paths used by the unhappy cows.”
8 weeks ago
The macOS Scripting Triumvirate on Vimeo
Even experts will learn something new, and everyone will gain insight. Join Automation guru Sal Soghoian as he reveals power features of AppleScript, AppleScriptObj-C, and JavaScript for Automation (JXA) that are often hidden in plain sight.
9 weeks ago
More than Automator on Vimeo
There’s more to Automator than the drag-and-drop creation of “automation recipes.” Learn what workflow variables, contextual system integration, and direct access to all of the automation power of the OS can do for you.
9 weeks ago