Swift Development with Visual Studio Code
Visual Studio Code (VSCode) is a cross-platform text and source code editor from Microsoft. It’s one of the most exciting open source projects today, with regular updates from hundreds of contributors. VSCode was among the first tools to support the Language Server Protocol (LSP), which has played a large part in providing a great developer experience in a variety of languages and technologies.

With the previously announced support for LSP for Swift now available in early development, it’s a great time to see how this integration works for yourself. So this week, we’ll walk through the process of getting started with Swift’s new Language Server Protocol support in Visual Studio Code on macOS. If you haven’t tried writing Swift outside Xcode, or are a VSCode user entirely new to the language, this article will tell you everything you need to know.

Swift support for the Language Server Protocol is still in early development and doesn’t currently offer pre-built packages for the language server (sourcekit-lsp) or the Visual Studio Code extension. For now, downloading, building, and installing these components and their dependencies is a manual process that requires some familiarity with the command line (it should take about 15 minutes on a reasonably fast internet connection). When LSP support is made generally available, it will be much easier to get everything set up.

Step 0: Install Xcode

If you don’t already have Xcode installed on your machine, open the Terminal app and run the following command:

$ xcode-select --install

Running this command presents a system prompt. Click the “Get Xcode” button and continue installation on the App Store.

Step 1: Install Visual Studio Code

Download Visual Studio Code and install it to your system Applications folder. Open the app and follow the instructions for launching from the command line. You’ll need to have the code command accessible from $PATH in order to install the SourceKit-LSP extension later on. 
Electron apps have a reputation for being big and slow, but don’t let that stop you from giving VSCode a try: its performance and memory footprint are comparable to a native app.

Step 2: Install the Latest Swift Toolchain

Go to Swift.org and download the latest trunk development snapshot (at the time of writing, this was from November 16th, 2018). Once it’s finished downloading, run the package to install the Xcode toolchain. To enable it, open Xcode, select the “Xcode > Pre
18 days ago
Functional architecture: a definition
How do you know whether your software architecture follows good functional programming practices? Here's a way to tell.

Over the years, I've written articles on functional architecture, including Functional architecture is Ports and Adapters, given conference talks, and even produced a Pluralsight course on the topic. How should we define functional architecture, though? People sometimes ask me about their F# code: How do I know that my F# code is functional?

Please permit me a little detour before I answer that question.

What's the definition of object-oriented design?

Object-oriented design (OOD) has been around for decades; at least since the nineteen-sixties. Sometimes people get into discussions about whether or not a particular design is good object-oriented design. I know, since I've found myself in such discussions more than once. These discussions usually die out without resolution, because it seems that no one can provide a sufficiently rigorous definition of OOD that enables people to determine an outcome.

One thing's certain, though, so I'd like to posit this corollary to Godwin's law: As a discussion about OOD grows longer, the probability of a comparison involving Alan Kay approaches 1.

Not that I, in any way, wish to suggest any logical relationship between Alan Kay and Hitler, but in a discussion about OOD, sooner or later someone states: "That's not what Alan Kay had in mind!" That may be true, even. My problem with that assertion is that I've never been able to figure out exactly what Alan Kay had in mind. It's something that involves message-passing and Smalltalk, and conceivably, the best modern example of this style of programming might be Erlang (often, ironically, touted as a functional programming language). This doesn't seem to be a good basis for determining whether or not something is object-oriented.

In any case, despite what Alan Kay had in mind, that wasn't the object-oriented programming we got. 
While Eiffel is in many ways a strange programming language, the philosophy of OOD presented in Object-Oriented Software Construction feels, to me, like something from which Java could develop. I'm not aware of the detailed history of Java, but the spirit of the language seems more compatible with Bertrand Meyer's vision than with Alan Kay's. Subsequently, C# would hardly look the way it does had it not been for Java. The OOD we got wasn't the OOD originally envisioned. To make matters worse
18 days ago
Streaming Multipart Requests
Foundation’s URL loading is robust. iOS 7 brought the new URLSession architecture, making it even more robust. However, one thing that it’s never been able to do natively is multipart file uploads.

What is a multipart request?

Multipart encoding is the de facto way of sending large files over the internet. When you select a file as part of a form in a browser, that file is uploaded via a multipart request. A multipart request mostly looks like a normal request, but it specifies a unique encoding for the HTTP request’s body. Unlike JSON encoding ({ "key": "value" }) or URL string encoding (key=value), multipart encoding does something a little bit different. Because the body of a request is just a long stream of bytes, the entity parsing the data on the other side needs to be able to determine when one part ends and another begins. Multipart requests solve this problem using the concept of “boundaries”. In the Content-Type header of the request, you define a boundary:

Accept: application/json
Content-Type: multipart/form-data; boundary=khanlou.comNczcJGcxe

The exact content of the boundary is not important; it just needs to be a series of bytes that is not present in the rest of the body (so that it can meaningfully act as a boundary). You can use a UUID if you like. Each part can be data (say, an image) or metadata (usually text, associated with a name, to form a key-value pair). If it’s an image, it looks something like this:

--
Content-Disposition: form-data; name=; filename=
Content-Type: image/jpeg

And if it’s simple text:

--
Content-Disposition: form-data; name=
Content-Type: text/plain

After the last part in the request, there’s one more boundary, which includes an extra two hyphens, ----. (Also, note that the new lines all have to be CRLF.) That’s pretty much it. It’s not a particularly complicated spec. In fact, when I was writing my first client-side implementation of multipart encoding, I was a bit scared to read the RFC for multipart/form-data. 
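To make the format concrete, here is a minimal sketch of assembling a multipart body in Swift. This is not the article's implementation; the part name "file" and filename "image.jpg" are illustrative values, and a production version would also need percent-escaping and streaming for large files.

```swift
import Foundation

// Sketch: build a multipart/form-data body for a single file part.
// Note that the line endings must be CRLF ("\r\n"), per the spec.
func multipartBody(boundary: String, fileData: Data, mimeType: String) -> Data {
    var body = Data()
    let crlf = "\r\n"
    // Each part starts with "--" followed by the boundary.
    body.append("--\(boundary)\(crlf)".data(using: .utf8)!)
    body.append("Content-Disposition: form-data; name=\"file\"; filename=\"image.jpg\"\(crlf)".data(using: .utf8)!)
    body.append("Content-Type: \(mimeType)\(crlf)\(crlf)".data(using: .utf8)!)
    body.append(fileData)
    body.append(crlf.data(using: .utf8)!)
    // The closing boundary gets two extra hyphens appended.
    body.append("--\(boundary)--\(crlf)".data(using: .utf8)!)
    return body
}
```

The resulting Data can be set as an URLRequest's httpBody, with the matching boundary declared in the Content-Type header.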
Once I took a look at it, however, I understood the protocol a lot better. It’s surprisingly readable, and nice to go straight to the source for stuff like this. That first implementation was in the Backchannel SDK, which is still open source. You can see the BAKUploadAttachmentRequest and the BAKMultipartRequestBuilder, which combine to perform the bulk of the multipart handling code. This particular implementation only handles a single file and no metadata, but it serves a
22 days ago
Components for building interactive charts with D3(v4)
6 weeks ago
Supporting Dark Mode: In-App Web Content
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

If you’re using a web view the way a browser does, to show arbitrary content from the web, then you probably don’t need to do anything special to accommodate Dark Mode. If Dark Mode takes off then maybe we’ll see some kind of CSS standard around presenting web sites for dark presentations, but for the time being users expect web pages to look … like web pages.

On the other hand, if you’re using web views to support an otherwise native Mac user interface, you’ll want to do something to adapt the default styling of your web content to look appropriate in Dark Mode. For example, I use a web view in my standard about box window. Here’s how it looked in MarsEdit before I adapted any web content to Dark Mode:

The two-tone look is kind of cool, but too much of an assault on the eyes for anybody who has really settled into Dark Mode. This content is not like a web page. It’s implemented in HTML to make features such as styling, layout, and links easier to manage, but as far as users are concerned it’s an innate part of this native About Box window.

I’ve seen a few approaches to adapting web content to Dark Mode, but most of them relied too heavily on modifying the actual HTML content that was being shown. In my apps, I use web views in several places and, rather than have to jump through hoops in each place to finesse the content for Dark Mode, I thought it would be better if I could come up with a common infrastructure that all web content presenters could use without typically being concerned about appearance. Because my app is using both legacy WebView and modern WKWebView, I had to duplicate my efforts to some extent, but for the purposes of this article I’m going to focus on the approach I took for WKWebView. 
My solution is rooted in arranging for a web view to call a pertinent JavaScript function when the effective appearance changes. Because WKWebView instances are notified via the viewDidChangeEffectiveAppearance method, I decided to subclass WKWebView to fulfill this contract on behalf of clients:

class RSAppearanceSensitiveWKWebView: WKWebView {
    var didInitialize = false

    // Override designated initializers to record when we're
    // done initializing, and avoid evaluating JS until we're done.
    override init(frame: CGRect, configuration: WKWebViewConfiguration) {
        super.init(frame: frame, config
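The excerpt is cut off above, but the core idea it describes can be sketched independently. This is not the author's exact code: the JavaScript hook name rsAppearanceChanged is a hypothetical convention that the page content would have to define.

```swift
import WebKit

// Sketch: forward macOS appearance changes into the page's JavaScript,
// so the page can restyle itself for Dark Mode. The JS function name
// "rsAppearanceChanged" is a hypothetical convention, not from the article.
class AppearanceSensitiveWebView: WKWebView {
    override func viewDidChangeEffectiveAppearance() {
        super.viewDidChangeEffectiveAppearance()
        // Reduce the (possibly nested) appearance to a light/dark answer.
        let isDark = effectiveAppearance.bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
        // Call the page's hook if it is defined.
        evaluateJavaScript("if (window.rsAppearanceChanged) { rsAppearanceChanged(\(isDark)); }")
    }
}
```

A guard like the article's didInitialize flag would be needed in practice, since evaluateJavaScript should not run before initialization completes.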
8 weeks ago
Supporting Dark Mode: Adapting Images
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

When Apple announced Dark Mode at WWDC 2018, one of my first thoughts was: “Aha! So that’s why Apple has us all switching our icons to template (monochrome) style images.” These images do tend to adapt pretty well to Dark Mode because the system can simply invert them and obtain a fairly usable icon, but the crude inversion doesn’t work well in all cases. There are a lot of details to really getting Dark Mode icons perfect, and Apple talks a lot about this in the Introducing Dark Mode WWDC session. Given my limited time and resources, I took a pragmatic approach to get things looking “good enough” so I could ship a cohesive project while I hope to continue working on refinements in the future.

As with colors, asset catalogs can be a great aid in managing image variations for different appearances. Use them if you can, but bear in mind the caveats mentioned in the Adapting Colors section of this series.

Most apps will probably require some refinement of toolbar icons, the row of images typically displayed at the top of some windows. In MarsEdit, I was in pretty good shape thanks to a recent overhaul for MarsEdit 4, in which Brad Ellis revised my toolbar icons. Many, but not all, of the images have a templated style aesthetic. Here’s MarsEdit’s main window in Light Mode:

Let’s see what happens when we just switch to Dark Mode without any special care for the icons: That’s … not so good! Even my vaguely template-style icons are not being treated as templates, so they render with their literal gray colors and look pretty bad in Dark Mode.

I realized I could probably do some quick Acorn work and get the template-style icons into shape, but what about the ones with splashes of color? The pencil? Should it still be yellow in Dark Mode? I opted for a pragmatic, stop-gap solution. 
Without making any changes whatsoever to the graphics files, I worked some magic in code and came up with this: That’s … actually pretty good!

So what’s the magic in code I alluded to? I created a custom subclass of NSButton that will optionally set template status on the button’s image only if we’re in Dark Mode. You can see that some of the icons I’ve left untouched, because I felt their colors fit well enough in both dark and light modes. Here’s my custom RSDarkModeAdaptingToolbarButton:

class RSDarkModeAdaptingTo
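The class is truncated in the excerpt above. A minimal sketch of the same idea, not the author's actual implementation, could look like this; the class and property names here are stand-ins:

```swift
import AppKit

// Sketch: mark the button's image as a template only while in Dark Mode,
// so colorful icons keep their colors in Light Mode. Names are illustrative.
class DarkModeAdaptingToolbarButton: NSButton {
    // Opt-in flag so some buttons (whose colors work in both modes)
    // can be left untouched.
    var adaptsTemplateStatusInDarkMode = true

    override func viewDidChangeEffectiveAppearance() {
        super.viewDidChangeEffectiveAppearance()
        guard adaptsTemplateStatusInDarkMode else { return }
        let isDark = effectiveAppearance.bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
        image?.isTemplate = isDark
    }
}
```

Because template images are tinted by the system, this gets monochrome-ish icons rendering legibly in Dark Mode without touching the graphics files.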
8 weeks ago
Supporting Dark Mode: Appearance-Sensitive Preferences
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

One of the challenges I dealt with in MarsEdit was how to adapt my existing user-facing color preferences to Dark Mode: The default values and any previously saved user customizations would be pertinent only to Light Mode. I knew I wanted to save separate values for Dark Mode, but I worried about junking up my preferences panel with separate labels and controls for each mode.

After playing around with some more complicated ideas, I settled on simplicity itself: I would register and save separate values for each user-facing preference, depending on whether the app is in Dark Mode right now. When a user switches modes, the visible color preferences in this panel change to the corresponding values for that mode. I reasoned that users are most likely to use these settings to fine-tune the appearance of the app as it appears now, and that it would be intuitive enough to figure out if they wanted to customize the colors separately for each mode.

How did I achieve this? I decided to bottleneck access to these preferences so that, as far as any consumer of the preference is concerned, there is only one value. For example, any component of my app that needs to know the “text editing color” consults a single property, “bodyTextColor”, on my centralized preferences controller. 
This is what the supporting methods, along with the property accessor, look like:

func color(forKey key: String, defaultColor: NSColor) -> NSColor {
    let defaults = UserDefaults.standard
    guard let color = defaults.rsColor(forKey: key) else {
        return defaultColor
    }
    return color
}

func setColor(color: NSColor, forKey key: String) {
    let defaults = UserDefaults.standard
    defaults.rsSetColor(color, forKey: key)
}

var bodyTextPreferenceKey: String {
    if NSApp.isDarkMode {
        return bodyTextColorForDarkModeKey
    } else {
        return bodyTextColorForLightModeKey
    }
}

@objc var bodyTextColor: NSColor {
    get {
        let key = self.bodyTextPreferenceKey
        return self.color(forKey: key, defaultColor: .textColor)
    }
    set {
        let key = self.bodyTextPreferenceKey
        self.setColor(color: newValue, forKey: key)
        self.sendNotification(.textColorPreferenceChanged)
    }
}

Because the getter and setter always consult internal properties to determine the current underlying defaults key, we always write to and read from the semantic value that the user expects. The only thing the user interface for my
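The rsColor(forKey:) and rsSetColor(_:forKey:) helpers used in the excerpt are not shown. A common way to implement color storage in UserDefaults, offered here as a sketch rather than the author's code, is to archive the NSColor to Data:

```swift
import AppKit

extension UserDefaults {
    // Sketch: store NSColor as an archived blob under a defaults key.
    // This matches the role of the rsColor/rsSetColor helpers referenced
    // above, not necessarily their actual implementation.
    func rsColor(forKey key: String) -> NSColor? {
        guard let data = data(forKey: key) else { return nil }
        return try? NSKeyedUnarchiver.unarchivedObject(ofClass: NSColor.self, from: data)
    }

    func rsSetColor(_ color: NSColor, forKey key: String) {
        if let data = try? NSKeyedArchiver.archivedData(withRootObject: color,
                                                        requiringSecureCoding: true) {
            set(data, forKey: key)
        }
    }
}
```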
8 weeks ago
Supporting Dark Mode: Adapting Colors
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

Given the dramatic visual differences between appearances, virtually every color in your app will need to be varied at drawing time to work well with the current appearance.

Use Semantic Colors

The good news is all of Apple’s built-in, semantically named colors are automatically adapted to draw with a suitable color for the current appearance. For example, if you call “NSColor.textColor.set()” in light mode, and draw a string, it will render dark text, whereas in dark mode it will render light text. Seek out areas in your app where you use hard-coded colors and determine whether a system-provided semantic color would work just as well as, or better than, the hard-coded value. Among the most common fixes in this area will be locating examples where NSColor.white is used to clear a background before drawing text, when NSColor.textBackgroundColor will do the job better, and automatically adapt itself to Dark Mode.

Vary the Color at Drawing Time

In scenarios where a truly custom color is required, you have a few options. If the color is hard-coded in a drawing method, you may be able to get away with simply querying NSAppearance.current from the point where the drawing occurs. For example, a custom NSView subclass that simply fills itself with a hard-coded color:

override func draw(_ dirtyRect: NSRect) {
    super.draw(dirtyRect)

    // Custom yellow for light mode, custom purple for dark...
    let lightColor = NSColor(red: 1, green: 1, blue: 0.8, alpha: 1)
    let darkColor = NSColor(red: 0.5, green: 0.3, blue: 0.6, alpha: 1)
    if NSAppearance.current.isDarkMode {
        darkColor.setFill()
    } else {
        lightColor.setFill()
    }
    dirtyRect.fill()
}

Use Asset Catalogs

Special cases like above are fine when you draw your own graphics, but what about views that the system frameworks draw for you? 
For example, because NSBox supports drawing a custom background color, you’re unlikely to implement a custom view exactly like the one above. Its functionality is redundant with what NSBox provides “for free.” But how can you ensure that NSBox will always draw in the right color for the current appearance? It only supports one “fillColor” property. A naive approach would involve paying attention to changes in appearance, and re-setting the fill color on NSBox every time. This would get the job done, but is more complicated and error-prone t
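The excerpt is cut off before the author's preferred solution, but one robust approach worth noting (a sketch under my own assumptions, not necessarily what the article goes on to recommend) is to hand NSBox a named color from an asset catalog. Catalog colors with Light and Dark variants resolve per-appearance at draw time, so the box adapts without any manual re-setting. The asset name "BoxBackground" is hypothetical:

```swift
import AppKit

// Sketch: asset-catalog colors are appearance-sensitive, so setting one
// as the fill color lets NSBox redraw correctly when the appearance
// changes. "BoxBackground" is a hypothetical asset-catalog color name.
func configureAdaptiveBox(_ box: NSBox) {
    box.boxType = .custom
    if let adaptiveColor = NSColor(named: "BoxBackground") {
        box.fillColor = adaptiveColor
    }
}
```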
8 weeks ago
Supporting Dark Mode: Responding to Change
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

To support Dark Mode elegantly in your app, you need to not only initialize your user interface appropriately for the current appearance, but also be prepared to adapt on the fly if the user changes appearance after your interface is already visible on the screen. In situations where you use semantic colors or consult the current appearance at drawing time, there’s nothing else to be done. Since changing the appearance invalidates all views on the screen, your app will redraw correctly without any additional intervention.

Other situations may demand more customized behavior. I’ll give a couple examples later in this series, but for now just take my word that you may run into such scenarios. If you have a custom view that needs to perform some action when its appearance changes, the most typical way to handle this is by implementing the “viewDidChangeEffectiveAppearance” method on NSView. This is called immediately after your view’s effectiveAppearance changes, so you can examine the new appearance, and update whatever internal state you need to before subsequent drawing or event handling methods are called:

class CustomView: NSView {
    override func viewDidChangeEffectiveAppearance() {
        // Update appearance related state here...
    }
}

If you need to react to appearance changes in the context of an NSViewController, your best bet may be to use Key Value Observing (KVO) to observe changes to the “effectiveAppearance” property on the view controller’s view. This approach will set your view controller up to be notified around the same time the view itself would be. In some situations it’s more interesting to know whether, at a high level, the application’s appearance has shifted in a significant way. For this you can observe “effectiveAppearance” on the NSApplication.shared instance. 
There is no standard NotificationCenter notification for this, but I found it useful enough in my code bases that I added my own. By centralizing KVO observation of NSApp.effectiveAppearance, I can translate all such changes into NotificationCenter-based broadcasts that any component of my app can observe. To implement something like this, first declare an NSNotification.Name extension in some high-level class such as your application delegate:

extension NSNotification.Name {
    public static let appAppearanceChanged
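The declaration above is truncated. As a sketch of the overall pattern (the notification name string and the observer type are my own illustrative stand-ins, not the author's code), the centralized KVO-to-notification bridge might look like:

```swift
import AppKit

extension NSNotification.Name {
    // Hypothetical name; the article's actual declaration is cut off above.
    public static let appAppearanceChanged = NSNotification.Name("AppAppearanceChanged")
}

// Sketch: observe the app-wide effectiveAppearance once, and rebroadcast
// changes through NotificationCenter so any component can listen.
final class AppearanceObserver {
    private var observation: NSKeyValueObservation?

    init() {
        observation = NSApp.observe(\.effectiveAppearance) { _, _ in
            NotificationCenter.default.post(name: .appAppearanceChanged, object: nil)
        }
    }
}
```

Components then subscribe with NotificationCenter.default.addObserver(forName: .appAppearanceChanged, ...) instead of each setting up their own KVO.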
8 weeks ago
Supporting Dark Mode: Checking Appearances
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

As you adapt your app to support Dark Mode, you may run into situations where you need to determine, in code, what the active appearance is. This is most typical in custom views that vary their drawing based on the current appearance, but you may also need to test higher-level appearance traits to determine, for example, whether the whole application is being run “in Dark Mode” or not.

Current and Effective Appearances

To work effectively with Dark Mode support, you must appreciate the distinction between a few key properties that are all of type NSAppearance. Take some time to read the documentation and watch the videos I referenced in Educational Resources, so you really understand them. Here is a capsule summary to use as reference in the context of these articles:

appearance is a property of objects, such as NSApplication, NSWindow, and NSView, that implement the NSAppearanceCustomization protocol. Any such object can have an explicit appearance deliberately set on it, affecting the appearance of both itself and any objects that inherit appearance from it. As a rule, views inherit the appearance of their window, and windows inherit the appearance of the app.

effectiveAppearance is a property of those same objects, taking into account the inheritance hierarchy and returning a suitable appearance in the likely event that no explicit value has been set on the object.

NSAppearance.current or +[NSAppearance currentAppearance] is a class property of NSAppearance that describes the appearance that is currently in effect for the running thread. Practically speaking, you can think of this property as an ephemeral drawing variable akin to the current fill color or stroke color. Its value impacts the manner in which drawing that is happening right now should be handled. 
Don’t confuse it with high-level user-facing options about which mode is set for the application as a whole.

High-Level Appearance Traits

As you modify your code to respect the current or effective appearance, you will probably need to make high-level assessments like “is this appearance light or dark?” Because of the aforementioned complication that there are many types of NSAppearance, that they can be nested, etc., it’s not possible to simply compare the current appearance with a named appearance. Instead, you use a method on N
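The sentence above is cut off. One NSAppearance method that fits this description is bestMatch(from:), which collapses a possibly nested appearance to the closest of a list of named appearances. A small helper along these lines (the isDarkMode property name is my stand-in, echoing the usage seen elsewhere in this series) might be:

```swift
import AppKit

extension NSAppearance {
    // Sketch: reduce any appearance (including high-contrast variants)
    // to a simple light/dark answer. Property name is illustrative.
    var isDarkMode: Bool {
        return bestMatch(from: [.darkAqua, .aqua]) == .darkAqua
    }
}

// Usage: check the app-wide appearance...
//   let appIsDark = NSApp.effectiveAppearance.isDarkMode
// ...or the appearance in effect for drawing right now:
//   let drawingIsDark = NSAppearance.current.isDarkMode
```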
8 weeks ago
Supporting Dark Mode: Opting In
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

To run any app in Dark Mode, you must be running macOS Mojave 10.14 or greater. By default, all existing apps will run in Light Mode even if the system is configured to run in Dark Mode. An app that is launched on macOS Mojave will run in Dark Mode when two criteria are met:

The system considers the app to be compatible with Dark Mode
The running application’s appearance is set to Dark Aqua

An app’s compatibility with Dark Mode is determined by a combination of the SDK it was built against, and the value of the “NSRequiresAquaSystemAppearance” Info.plist key. If your app is built against the 10.14 SDK or later, it will be considered compatible unless the key is set to YES. If your app is built against the 10.13 SDK or earlier, it is considered incompatible unless the Info.plist key is set to NO.

When a compatible app is launched, its appearance is set to match the user’s system-wide preference, as selected in the “General” tab of System Preferences. To streamline development, Apple also provides a switch in Xcode itself so that the appearance of a running app can be switched on the fly without affecting the appearance of other apps running on your Mac.

Although the Xcode switch is handy for making quick comparisons between modes, there is not, as far as I know, any mechanism to always launch an app from Xcode in Dark Mode when the system is in Light Mode, or vice versa. If you strongly prefer one mode over the other, you may want to build affordances into your app that support debugging in “the other mode” when you need to. 
For example, in the build settings for your app, find “Other Swift Flags,” and add “-DDEBUG_DARK_AQUA”. Then, somewhere early in your app’s launch, you can conditionally force the appearance if specified:

func applicationDidFinishLaunching(_ notification: Notification) {
    #if DEBUG_DARK_AQUA
    NSApp.appearance = NSAppearance(named: .darkAqua)
    #endif
}

This arrangement will allow you to run Xcode and other apps in Light Aqua while debugging your own app in Dark Mode. Check back tomorrow for the next article in the series!
8 weeks ago
Supporting Dark Mode: Educational Resources
This article is part of a series on Supporting Dark Mode in macOS applications. For the best experience, start with the Introduction and work your way through.

The first trick to tackling any new challenge is getting your technical references sorted. My advice to all developers is to watch the pertinent WWDC sessions, read the high-level documentation, and immerse themselves in the Dark Mode aesthetic by perusing other applications that support it.

WWDC Sessions

It goes without saying that every Mac developer should watch What’s New in Cocoa for macOS. This is always a great overview that touches on the major changes to follow up on in other sessions. For Dark Mode in particular, be sure to watch both the Introducing Dark Mode and Advanced Dark Mode sessions. These include a lot of excellent advice about both high-level design considerations and low-level practical uses of the NSAppearance API.

Documentation

Apart from the reference documentation on NSAppearance, be sure to read the longer-form Supporting Dark Mode in Your Interface, and Providing Images for Different Appearances. These high-level guides will orient you to the types of work you will likely need to do in your app. Finally, the macOS 10.14 Release Notes include a number of details about Dark Mode, and particularly about special background blending modes and how they affect the user interface of an application.

Immerse Yourself

Although dark interfaces are nothing new, Apple’s official take on it with macOS Mojave establishes specific aesthetic choices. You’ll want to become acquainted with the decisions Apple has made so you can make the right call in your own app. I recommend switching your macOS Mojave Mac to Dark Mode and running as many apps as possible in Dark Mode to get a sense for the prevailing aesthetics. The vast majority of Apple’s own apps have been tastefully adapted, and a growing number of 3rd party titles are listed on the Mac App Store as Apps That Look Great in Dark Mode. 
I’m honored that one of my own apps, MarsEdit, is listed there. Check back tomorrow for the next article in the series!
9 weeks ago
Dark Mode Series: Introduction
I spent a good part of the summer learning about macOS Mojave’s new Dark Mode theme, and how Mac apps can support the theme both in technical and practical ways. I adapted MarsEdit, Black Ink, FlexTime, and FastScripts to the new interface style. During that process, I learned a lot about where to look for advice, and how to handle common scenarios. I’d like to share that advice with folks who have yet to undertake this work.

The gist of what I have to share comes from tackling challenge after challenge in my own apps. Some interfaces adapted effortlessly to Dark Mode, some needed only a little finessing, while others demanded relatively hard-core infrastructural changes.

My advice will focus on the dichotomy of Light Mode and Dark Mode. The Mac’s appearance support is more nuanced than that. NSAppearance supports a hierarchy of appearances that build upon one another. The light and dark modes are the two most prominent user-facing examples, but variations such as high contrast modes should also be considered.

These articles are loosely organized in order from more fundamental to more arcane, with a priority on establishing knowledge and techniques in earlier articles that you may need to reference in later articles. Feel free to jump around if you’re looking for something special.

So, let’s get started! The first article introduces you to a few educational resources that you should consult both before and while you’re working on adding Dark Mode support to your apps:

Dark Mode Series: Educational Resources
Dark Mode Series: Running in Dark Mode
Dark Mode Series: Current and Effective Appearance
Dark Mode Series: Responding to Appearance Changes
Dark Mode Series: Adapting Colors
Dark Mode Series: Appearance-Specific Preferences
Dark Mode Series: Adapting Images
Dark Mode Series: Adapting Web Content
Dark Mode Series: Dynamic Background Tinting

Check back every day, or subscribe to the blog’s feed, to keep up with the articles as they are posted.
9 weeks ago
What if we had oracles for common machine learning problems?
Rough working notes, musing out loud.

Much effort in machine learning and AI research is focused on a few broad classes of problem. Three examples of such classes are:

Classifiers, which do things like classify images according to their category, generalizing from their training data so they can classify previously unseen data in the wild;

Generative models, which are exposed to data from some distribution (say, images of houses), and then build a new model which can generate images of houses not in the training distribution. In some very rough sense, such generative models are developing a theory of the underlying distribution, and then using that theory to generalize so they can produce new samples from the distribution;

Reinforcement learning, where an agent uses actions to explore some environment, and tries to learn a control policy to maximize expected reward.

These are old problem classes, going back to the 1970s or earlier, and each has seen tens of thousands of papers. Each of these problem classes is really beautiful: they’re hard, but not so hard it’s impossible to make progress; they’re precise enough that it’s possible to say clearly when progress is being made; they’re useful, and seem genuinely related to essential parts of the problem of AI.

I occasionally wonder, though: what’s the end game for these problem classes? For instance, what will it mean if, in some future world, we’re able to solve the classifier problem perfectly? How much would that help us achieve the goal of general artificial intelligence? What else would it let us achieve? In other words, what happens if you skip over (say) the next few decades of progress in classifiers, or generative models, or reinforcement learning? And they become things you can just routinely do essentially perfectly, perhaps even part of some standard library, much as (say) sorting routines or random number generation can be regarded as largely solved problems today. 
What other problems then become either soluble, or at least tractable, which are intractable today? Perfect solutions don’t obviously help, even with closely adjacent problems: One obvious point is that you can make a great deal of progress on one of these problems and it doesn’t necessarily help you all that much even with problems which seem closely adjacent. For instance, suppose you can classify images perfectly. That doesn’t necessarily mean that you can solve the image segmentation problem – identifying the
9 weeks ago
The varieties of material existence
By Michael Nielsen

Status: Rough and speculative working notes, very quickly written – basically, a little raw thinking and exploration. Knowledgeable corrections welcome!

William James wrote a book with the marvellous title “The Varieties of Religious Experience”. I like the title because it emphasizes just how many and varied are the ways in which a human being can experience religion. And it invites followup questions, like how aliens would experience religion, whether other animals could have religious experiences, or what types of religious experience are possible in principle.

As striking as are the varieties of religious experience, they pale beside the variety of material things that can possibly exist in the universe. Using electrons, protons, and neutrons, it is possible to build: a waterfall; a superconductor; a living cell; a Bose-Einstein condensate; a conscious mind; a black hole; a tree; an iPhone; a Jupiter Brain; a working economy; a von Neumann replicator; an artificial general intelligence; a Drexlerian universal constructor (maybe); and much, much else.

Each of these is astounding. And they’re all built from arrangements of electrons, protons, and neutrons. As many people have observed, with good enough tweezers and a lot of patience you could reassemble me (or any other human) into a Bose-Einstein condensate, an iPhone, or a black hole. We usually think of all these things as separate phenomena, and we have separate bodies of knowledge for reasoning about each. Yet all are answers to the question “What can you build with electrons, protons, and neutrons?”

For the past decade or so, when friends ask me what is the most exciting thing happening in science, one of the subjects I often burble about excitedly is quantum matter – very roughly, the emerging field in which we’re engineering entirely new states of matter, with intrinsically quantum mechanical properties. 
It turns out there are far more types of matter, with far weirder properties, than people ever dreamed of. I’m not an expert on quantum matter; I only follow it from afar. Yet what I see makes me suspect something really profound and exciting is going on, something that may, in the decades and centuries to come, change our conception of what matter is. Furthermore, it seems to me that many other very interesting nascent ideas have a similar flavour: things like programmable matter, smart dust, utility fog, synthetic biology, and so on. In a detailed technical sense these