John Siracusa’s OS X reviews
Today John Siracusa announced that he won’t be writing any more OS X reviews.
Typically, journalists or reviewers don’t announce that they are stopping something. They just stop doing it, and maybe explain it later if someone asks.
But John’s reviews were something truly special, and a lot of people in the tech world have lamented the announcement.
I think that the Mac community has always been quite vibrant and passionate, allowing for detailed discussion, sometimes crossing the line into obsession. In other tech worlds, the discussion is colder and more rational, even aseptic. Apple discussion has always been more emotional, sort of aiming for greatness. Years ago, when Apple had a small and passionate group of followers, the distinction was quite pronounced.
Tech reviews tend to fall into one of two extremes: either too much enumeration of specs and features, making them arid and boring, or too much focus on fitting a narrative, debating whether something is the best or the worst ever.
But these OS X reviews strike the perfect balance between the two, and they remain the gold standard of tech literature.
John’s reviews are extremely detailed, even to the level of obsession, yet they are easy to read and understand.
They present a clear exposition of new features, but also add the historical context of decisions and compromises, and forecast areas of improvement.
Siracusa has strong opinions in quite a few areas, but he lays out all the facts and explains his biases.
I can’t stress how difficult it is to keep readers hooked and eager to read another 8,000 words when you’re talking at length about filesystems, pixel alignment, background process handling or Finder performance.
I haven’t found anything on par with them, though there are great writers covering technology these days; I can read better reviews now than 5 or 10 years ago.
While I understand his reasons for stopping, and think we have been lucky to have them around for so long, I can’t help but feel a little sad.
I’ll keep following him on his blog, on Twitter and, of course, on his weekly podcast and other collaborations. At least John Siracusa keeps up a healthy output for us to consume. I’m sure we’ll keep reading and listening to great stuff.
Fascinating
The importance of Spock as an icon of the 20th century cannot be overstated.
Let me go back for a second. When I was a child in Spain, access to Star Trek was pretty limited. There were no reruns of the original series; I think it was only broadcast during the 70s, in black and white. The Next Generation aired with a four or five year delay, and it stopped for several years right at the end of the third season. It took me years to find out what happened with Picard and Locutus! The movies were not big hits either.
But everyone knew who Spock was. A lot of people told my uncle he looked like “Dr” Spock; he bore a clear resemblance to Leonard Nimoy when he was young.
In a lot of ways I’ve always considered Star Trek my “prime geeky love”. That iconic status and mystery made me seek it out. Once I had access to the Internet, I devoured everything about it. It was “my thing”, more than other stuff I love, because it was less known among my circles. I bought the whole DS9 series on VHS from the UK, over the course of several years. I say that I learned spoken English through Star Trek, and written English through RPG manuals.
Spock is the most genuine example of the “cool scientist”, in opposition to the “mad scientist” trope. Probably the first example. And he was the one getting the spotlight in the series; he was the character to follow.
He is iconic on a level that only a handful of other icons can match. I’ve seen him on party posters and on t-shirts in big stores. He’s known by people who don’t even know what Star Trek is.
And none of this would have been possible without the extraordinary performance by Leonard Nimoy. It is clear that he contributed heavily to a lot of the things that define Spock, from the Vulcan nerve pinch to the Vulcan salute. But his biggest legacy is the fact that his performance is replicated constantly. Every single Vulcan in Star Trek is impersonating Leonard Nimoy playing Spock, and tons of other media have similar interpretations of “emotionless aliens/outsiders”. He created an archetype, which is a huge achievement. That is also a heavy burden, and Leonard Nimoy seemed to embrace it only after years of struggle.
There is also an overlooked idea in Vulcan philosophy that has always appealed to me: IDIC, Infinite Diversity in Infinite Combinations. It resonates with me on a lot of different levels. On the personal level, in how important it is to embrace the diversity that we humans are capable of producing.
And on the software level, in how wonderful it is when different pieces of code are combined and assembled to produce something magical together.
Leonard Nimoy will be greatly missed. Rest In Peace.
Live long and prosper.
Compendium of Wondrous Links vol VIII
More great reads!
About code creation
- It seemed like a good idea at the time. How tech decisions made at one point in time can have a big impact much, much later. Unfortunately, this is unavoidable; developing software means dealing with imperfect information all the time.
- Fear Driven Development.
- Dealing with different languages is difficult in programs (and otherwise).
- Seven Laws of Sane Personal Computing.
- Great compilation of Python libraries that deserve to be widely used.
- Debug like Sherlock Holmes. One of my favourite ways of thinking about debugging is: “Once you eliminate the impossible, whatever remains, no matter how improbable, must be the truth.”
- How to work with Hexagonal grids. An amazing article and a must-read if you want to work with this kind of grid in a game.
- The sed FAQ. One of my resolutions for this year is to learn sed better and use it more.
- Small general-purpose commands that can be combined to compose larger commands. The lesson of Vim. An interesting follow-up here.
- git is the epitome of a UNIX tool: powerful, tied to the command line, high learning curve, amazing. Hidden consistency.
- A talk on how to program coroutines in Python through generators (a minimal sketch of the idea follows this list).
- Using command line tools for map reduce. Piping data is incredibly powerful. Also, GNU Parallel is an amazing tool.
- Use crypto to avoid database writes.
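As a taste of the coroutine talk’s topic, here is a minimal sketch of the generator-as-coroutine pattern. It’s my own rendition of the classic grep coroutine, not taken from the talk itself:

```python
# A coroutine built on a generator: execution suspends at `yield`
# and resumes every time a value is pushed in with .send().
def grep(pattern):
    print(f"Looking for {pattern!r}")
    while True:
        line = yield               # pauses here, waiting for input
        if pattern in line:
            print(line)

g = grep("python")
next(g)                            # "prime" the coroutine up to the first yield
g.send("no match in this line")    # nothing printed
g.send("python is everywhere")     # printed: it contains the pattern
```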
The job of developing
- Speed in Software Development. A lengthy article about the different paces of developing software and the impact of different factors. I’ve already talked about why I think “sprint” is a bad name.
- Goals in code coverage. Metrics should be analysed critically, so they stay focused on the meaningful objective.
- While I understand the concept, I don’t really like the word talent. Like every overused word, it can hide very dysfunctional things.
- Simplifying the problem doesn’t usually look impressive. The Parable of Two Programmers.
- Bug finding is slow in spite of many eyeballs.
- Feature requests that look simple. How a taxation change took 4 developers a week to handle.
- Performance reviews are a complicated part of the manager-managed relationship. What your manager really thinks of you.
- Great advice for managers: 44 engineering management lessons.
- Why sometimes I hate myself, on the self-imposed pressure of comparing yourself with the (unreal) public faces of others. Another related article here.
- I still can’t see why open-plan offices are still the standard in new companies.
- Fascinating read. God’s Lonely Programmer.
- Good tips on writing a technical CV.
- Computers are awesome!
- “Rock Star” Programmers. “Forty years later Weinberg’s ‘egoless’ approach, in which mistakes are accepted as inevitable and reviews are performed in a collegial way, remains the sanest way to produce code.”
Other stuff
- Amazing models from the original Star Wars trilogy.
- And how sound effects were done back in the day. Millennium Falcon’s hyperdrive malfunction.
- Also, preproduction concept images for Star Wars.
- Raiders of the Lost Ark in Black and White. It looks great.
- Mastering records, and why all records sound the same.
- Using the Nintendo Power Glove as a stop-motion animation tool. An amazingly creative use of a tool for something completely different from what it was created for.
- The problem of measuring financial assets and determining a bank’s profit (or loss).
- Why adventure games suck and what we can do about it. A 1989 article by Ron Gilbert. Fantastic.
- The iPod classic has been discontinued. We probably won’t see another device devoted solely to listening to music. On Death and iPods: A Requiem.
- How Bezier curves are drawn.
- Getting exposure for an indie game is difficult. Some advice on it.
- DDoS attacks are growing to be very scary. A good article about how they work. The visualisation is mesmerising.
- What do your passwords say about you? The secret life of passwords.
- A Brief History of Databases.
- Erlang the movie. Every programming language should make one of these. I love that they do the demo on phones, as the erlang is a unit of telephone load.
- IBM model M. Arguably the most famous keyboard ever made.
- Databases at 14.4MHz. An impressive benchmark.
TV interface
Isn’t it quite absurd that we haven’t nailed this yet?
We keep hearing about great advances in image quality, 4K, bendable screens… Yet controlling a TV still feels clunky and awkward.
Even worse, given the ever-growing number of devices connected to the TV (DVD players, Blu-ray players, AppleTVs, consoles…), juggling several remote controls is painful unless a universal remote is used. And even then, the process of selecting an activity is awkward for very common operations.
For example, if you want to play a movie on an AppleTV (or DVD player, or TiVo, etc.), you may have to turn it on, then (using the TV remote) turn on the TV, select the input (which may involve cycling through a lot of unused inputs, like composite), adjust the volume, and then pick up the AppleTV remote.
If at any point we need to adjust the volume, we’ll need to grab the TV remote again.
All this is very weird and not the best user experience…
So, just for the sake of discussion, let me throw out a few ideas for what I think could be a TV with a much better interface:
– Everything is a channel. Why is HDMI “an input” while TV channels are numbered? Letting the user map an HDMI input to, say, channel 3 allows easy access to it and simplifies the interface.
If there is no input signal (e.g. the DVD player is off), the channel is skipped for channel-surfing purposes, unless the user specifically requests that channel number. (A minimal sketch of this model follows the list of ideas.)
– A good remote. I don’t think there’s any excuse for the included remote not being a universal, programmable one. It should aim to control more things than just the TV. It should have as few buttons as possible; more on that later. And, another important feature: it should be possible to make it beep when lost (e.g. by clicking a button on the TV).
– Programmable channels. When a channel is selected, all its settings (mainly image controls, but possibly also a volume adjustment) are applied. The remote also knows that its controls now belong to this channel. For example, if the DVD channel is selected, most of the keys map to the DVD remote (except the volume keys); if the AppleTV channel is selected, the keys map to the AppleTV controls.
A list of channels (ideally showing their current input) that allows easy rearrangement, enabling and disabling would be great for navigation and for an overview.

An interesting (and useful) design, a two-sided remote on the Chromebox. Obviously, not from a TV manufacturer
– A companion app that allows all the parameters to be configured easily from a PC or a mobile device.
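To make the discussion concrete, here is a minimal sketch of the “everything is a channel” model in Python. All the names are hypothetical (no TV exposes this API); it only illustrates the channel numbering, the skipping of dead inputs while surfing, and the per-channel presets:

```python
# A hypothetical model of "everything is a channel": every source, tuned
# or external, is just a numbered channel with its own presets.

class Channel:
    def __init__(self, number, name, source, active=True, settings=None):
        self.number = number
        self.name = name                # "Antenna", "DVD", "AppleTV"...
        self.source = source            # physical input: "RF", "HDMI1"...
        self.active = active            # is the device powered on / present?
        self.settings = settings or {}  # per-channel image/volume presets

class TV:
    def __init__(self, channels):
        self.channels = sorted(channels, key=lambda c: c.number)
        self.current = self.channels[0]

    def select(self, number):
        """Direct selection works even if the source is off."""
        self.current = next(c for c in self.channels if c.number == number)
        self._apply(self.current)

    def surf(self):
        """Channel up: channels with inactive sources are skipped."""
        active = [c for c in self.channels if c.active]
        i = active.index(self.current) if self.current in active else -1
        self.current = active[(i + 1) % len(active)]
        self._apply(self.current)

    def _apply(self, channel):
        # Here a real TV would switch the input, load the image/volume
        # presets and remap the remote keys to the selected device.
        print(f"-> {channel.name} ({channel.source}), settings {channel.settings}")

tv = TV([
    Channel(1, "Antenna", "RF"),
    Channel(3, "DVD", "HDMI1", active=False),  # DVD player is off
    Channel(4, "AppleTV", "HDMI2", settings={"picture": "cinema"}),
])
tv.surf()      # skips the dead DVD channel, lands on the AppleTV
tv.select(3)   # ...but channel 3 is still reachable by direct request
```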
I understand that the landscape of different devices in the living room is complex, and that some problems may be impossible to fix (for example, an HDMI switch may still cause trouble, given that more than one “channel” would be coming from the same “input”).
But I find it quite ridiculous that we are still dealing with such terrible TV interfaces these days…
Future as a developer and the ever changing picture
A few weeks ago I came across a couple of articles by Marco Arment that share a theme: the accelerated pace of change within the development community is stressful, and it makes staying up to date difficult. After all, one gets tired of learning a new framework or language every six months. It gets to a point where it’s not fun or interesting anymore.
It seems like developers are presented with two options after some time in the field:
- Keep up, meaning that you rapidly adopt each new technology
- Move to other areas, typically management
Both are totally valid options, though I’ve already said on this blog that I don’t like it when good developers move to different areas (to me it’s sort of like a surgeon deciding she has had enough after a few years and moving on to manage the hospital). Obviously, each person has absolutely every right to choose their career path.
But I think that it’s all mostly based on a biased and incorrect view of the field of technology and the real pace of change.
In the last few years there has been an explosion of technologies, in particular for the web. Ruby on Rails almost feels as if it were introduced at the same time as COBOL. NodeJS seemed to be in fashion for a while; the same with MongoDB or jQuery.
In the last 6 or 7 years there has been an incredible explosion in open source fragmentation. Probably because of GitHub (and other online repos) and the increase in communication through the Internet, the bar for creating a web framework and offering it to the world has been lowered so much that a lot of projects that previously would have gone unexposed now get attention. The general effect is positive, but it comes with a negative one: every year there is a revolution in technologies, forcing everyone to catch up and learn the brand new tool that is supposedly the best for current development, increasing the churn of buzzwords.
But all this is nothing but an illusion. We developers tend to laugh at the typical “minimum 3+ years of experience in Swift” job ad, but we still buy into the notion that we should be experts in a particular language, DB or framework from day one. And, of course, in the one in demand today, or we are just outdated dinosaurs who should retire.
Software development is a young field, full of young people. That’s great in a lot of ways, but we need to appreciate experience, even if it comes from using a different technology. It doesn’t look like it, but there are still a lot of projects done in “not-so-fancy” technologies. That includes really old stuff like Fortran or COBOL, but also C++, Java, Perl, PHP or Ruby.
Technologies get established by a combination of features, maturity, community and a little luck. But once they are established, they’re quite resilient and don’t go away easily. They remain useful for quite a long time. Right now it’s not that difficult to pick a tool that is almost guaranteed to be around for the next 10-15 years. Also, most of the really important stuff is totally technology agnostic: writing clean code, structure, debugging ability, communication, teamwork, transforming abstract ideas into concrete implementations, etc. That simply does not go away.
Think about this. iOS development started in 2008. Smartphones are radically different beasts from the ones available 6 years ago, probably the environment that has changed the most. The basics are the same, though. And even if Swift has been introduced this year, it’s based on the same principles. Every year there have been tweaks, changing APIs and new functionality, but the basic ideas are still the same. Today a new web development using LAMP is totally viable. Video games still rely on C++ and OpenGL. Java is still heavily used. I constantly use ideas developed mainly in the 70s, like the UNIX command line or Vim.
Just because every day we get tons of news about new startups building applications on new paradigms doesn’t mean those don’t coexist with “older” technologies.
Of course, there are new tricks to learn, but it’s a day-by-day, additive effort. Real revolutions and paradigm shifts are rare, and normally not a good sign. Changing from MySQL to PostgreSQL shouldn’t be considered a major career change. Seeking a certain stability in the tools you use should be seen as a good move.
We developers love to stress the part about learning something new every day and constantly challenging ourselves, but that should be put in perspective by allowing time to breathe. We’ve created a lot of pressure on ourselves in terms of constantly pushing new ideas, investigating side projects and devoting 100% of our time to software. That’s not only unrealistic. It’s not good.
You only have to breathe. And just worry about doing good work and enjoying learning.
Compendium of Wondrous Links VI
- They finally found all those buried Atari cartridges, and confirmed a beloved urban legend. Just wonderful.
- This episode of @ExtraCreditz follows up on an idea I’ve always had about education. The key is being demanding, but allowing a lot of opportunities.
- An amazing book introduction, showing how no one is immune to feeling stupid. Lots of things in life are hard.
- Readability in code is not about being literary. It’s about making the code easy to understand. You don’t read code, you explore it.
- The Great Works of Software. The premise is extremely interesting. What are the most influential pieces of software?
- The hilarious (it’s funny because it’s true) Programming Sucks, and a follow-up, What Programming is Like.
- Is programming a dead-end job? I still can’t help but feel sad each time a (good) developer decides to move into management.
- It’s easy to forget how much things have changed in terms of software distribution. What Writing and Selling Software Was Like in the 80’s (yep, also from The Codist; you should subscribe).
- The computer world is heavily dominated by English, and even more so by the Latin alphabet. This idea of making a programming language in Arabic is fascinating. It not only shows how difficult it is to set up a working environment outside “the ASCII world” (the magnitude is not comparable, but trying to code in languages like French or Spanish also has a lot of friction), it also shows how alien (yet beautiful) a different alphabet looks. I wonder what code and programming would be like if the dominant language had been something like Chinese or Arabic.
- What is the “Agile mindset” anyway? The graph is very interesting, especially the “Chaos labeled as Agile” side.
- I don’t really like the idea of a “rivalry” between Vim and Emacs; I prefer to consider them two valid options. But this article does a good job of explaining their different appeals and why they have been around for such a long time in computer-years.
- 10 Most common Python mistakes. Good to check (a classic example of this kind of mistake is sketched below).
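To give a flavour of what such lists cover (I won’t swear this one is in that particular article), here is the classic mutable default argument pitfall; the default list is created once, at function definition time, and shared between calls:

```python
# Classic pitfall: a mutable default argument is created once and
# shared across every call to the function.
def append_item_buggy(item, bucket=[]):
    bucket.append(item)
    return bucket

print(append_item_buggy(1))   # [1]
print(append_item_buggy(2))   # [1, 2]  <- state leaked from the first call

# The idiomatic fix: use None as a sentinel and build the list per call.
def append_item(item, bucket=None):
    if bucket is None:
        bucket = []
    bucket.append(item)
    return bucket

print(append_item(1))   # [1]
print(append_item(2))   # [2]
```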
Visual Programming and Mental Constructs
Yesterday I watched the Apple WWDC keynote live. I am far from an Apple developer, but I use OS X and iOS every day, and I’m interested in new stuff. There was a full section devoted to developers, which is great (well, it’s supposed to be a developers’ conference, after all), and, arguably, the most interesting part of it (from a developer’s perspective) was the release of a new programming language, Swift.
It was announced with an (irrelevant) comparison with Python in terms of speed (I actually have plans to write a post about “why Python is not really slow”, but I digress), as well as a lot of other details that (IMO) are completely beside the point of what makes a programming language good or bad.
I am generally skeptical about announcements of new languages, almost as much as about new web frameworks. Sure, they add a new flavour, but I’m not so sure they really advance the state of the art. Creating a new language, complete with a proper “clean and beautiful” syntax, is not really that difficult. The difficult part is to create a vibrant community behind it, one that loves the language and works to expand it, to push the boundaries of current tech, to make amazing applications and tools, to convince other developers to use it and to carry the torch. The target audience of a language is developers. “End customers” couldn’t care less about how the guts of their products are made. “Ruby Sharp? Whatever, I just need it to help us increase our sales.”
Interestingly enough, languages get a lot of their character from their communities, as communities embed their values in the relevant modules and tools. A great example of that is “The Zen of Python”. There’s nothing in it about whitespace, list comprehensions or classes, but it reflects a lot of the ideas common in the Python world, the values of the Python community. Using a language is not just writing code, but also interacting with other developers, directly or even just by reading the documentation and using the APIs.
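The Zen of Python even ships with the interpreter itself, as a deliberate easter egg:

```python
# Importing the `this` module prints Tim Peters' nineteen aphorisms
# to standard output.
import this
```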
Obviously, Apple is in a very special situation, as it can force developers to use whatever it likes for its platform. Hey, they managed to create an Objective-C ecosystem out of nowhere, which is impressive. For what it’s worth, they can even tailor a language to their platform and not worry about anything else. iOS is a platform big enough that devs will have to learn the language and the official IDE and use them. And I am pretty sure that in this case it will be an improvement over the previous environment.
But the part that I am most skeptical about is the “visual programming” stuff. One of the “wow” announcements was the possibility of creating “playgrounds” that show the results of the code interactively. That means that, for example, a loaded image will be shown, or a graph can be displayed with the results of a function. And that’s the part I’m really not sure is interesting or relevant at all.
Does it look cool? Absolutely. Might it be interesting once in a while? Sure. But I think it’s the kind of feature that, in day-to-day work, is not really that useful in most kinds of programming.
Programming, more than anything else, is creating a mental model of code. Code can be a very complex thing, especially in a big application, but normally we don’t need to keep the whole thing in our mind. We only have to keep certain parts of it, allowing us to focus on one problem at a time. That’s the main principle behind modules, classes and other abstractions. I can use OS calls to open a file, draw some pixels on the screen, or make a call to a remote server, all without having to worry about filesystems, graphics drivers or network protocols. And I can also use higher-level modules to search in files, create 3D models or make HTTPS calls.
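As a tiny illustration of those levels (a sketch using only Python’s standard library, with example.com as a placeholder host), the same conceptual operation can be written while ignoring the protocol entirely, or by hand against a raw socket:

```python
# High level: the standard library hides sockets, HTTP and buffering.
import urllib.request

with urllib.request.urlopen("http://example.com") as response:
    body = response.read()

# Lower level: the "same" request written by hand over a raw socket.
import socket

s = socket.create_connection(("example.com", 80))
s.sendall(b"GET / HTTP/1.1\r\nHost: example.com\r\nConnection: close\r\n\r\n")
raw = b"".join(iter(lambda: s.recv(4096), b""))  # read until the server closes
s.close()
```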
And the amazing power of programming is that you are coding on the shoulders of giants. And on the shoulders of regular people. And on the shoulders of your co-workers. And on your own shoulders. That’s a lot of shoulders combined.
But a lot of that process deals with the unavoidable complexity of the interactions. Being able to move from an abstract view to a more specific one, to look inside and outside the black box, is crucial. It may not be evident, but the mental process of programming deals a lot with that sudden change in perspective. This is one of the reasons multiparadigm languages are useful: you can move between different abstractions and levels, using the proper one in each case (especially around the leaky ones).
And there are lots of those constructs that are not easily represented with graphs or images. They live in your mind: loops, flexible structures, intuitions about the weak points of an algorithm, variables changing values, corner cases… Showing every intermediate result may be detrimental to that quick change in perspective. Too much information.
There have been experiments with visual programming, trying to represent code as visual blocks in one way or another, for a long time (at least 25 years). They are useful in certain areas, but they are far from a general solution. There are also interactive notebooks that allow easy display of graphs and help with interactivity; the iPython Notebook is an excellent example (and a very similar idea to the playground). But, again, I feel that those are specialised tools, not something that is that useful in most programming contexts.
I’m just skeptical. None of this necessarily means that Swift is bad, or that those tools are wrong. Maybe the new Xcode will have a lot of amazing tools that will help create fantastic applications (I still don’t like IDEs, though). There are already people checking the docs and giving the new language a try. But I think it has to show how good or bad it is by itself, through the developers who decide to use it. So far, it is just an announcement. I just feel that most of what was said at the keynote was not relevant to determining whether it’s a good working environment; it was a gimmick. Yes, obviously these kinds of announcements are publicity stunts, but in this particular case it looks especially so.
It looks cool, but it is not particularly relevant to how the mental process of programming works or to what makes a language good.
Compendium of Wondrous Links vol V
- Seven habits of effective text editing. A great essay by Bram Moolenaar (of Vim fame). It is applicable to any editor but, of course, shows why Vim can be such a good choice (once you know how to use it, obviously).
- A useful collection of recipes in Python: Thirty python language features and tricks you may not know (one sample of that flavour is sketched after this list).
- How to be a sane programmer. Basically, do other stuff not related to programming. The related Business Insider article is also worth the read.
- The Evolution of a Software Engineer
- D/A and A/D Digital Show and Tell. A great explanation of how sampling and analog conversion work. I spent my college years dealing with this stuff (and using the same equipment); it is explained beautifully.
- 10 important URLs that every single Google user needs to know. Interesting stuff about privacy and Google.
- A glass breaking, recorded at high speed.
- How to Create an Awesome Candidate Experience. One thing that I particularly liked about it is that it acknowledges how emotionally exhausting going through a recruitment process is for the candidate.
- Game servers: UDP vs TCP. A great article about the differences between TCP and UDP, which are usually not well understood.
- Amazing precision. The art of Street Typography.
- I love this quote: “I do not want to be a ‘rock star’. I want to be a good engineer on a great engineering team”. I talked previously about this, and how a great team will be much more productive than a bunch of “Ninja Developers”.
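For example, one small trick of the flavour collected in that Python recipes link (my own example, not necessarily one of the thirty):

```python
# zip(*rows) transposes a matrix in one line.
matrix = [[1, 2, 3],
          [4, 5, 6]]

transposed = list(zip(*matrix))
print(transposed)  # [(1, 4), (2, 5), (3, 6)]
```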
The amazing forgiveness of software
One of the things I like most about developing software is the fact that you can recover from most mistakes with very little long-term impact.
Bugs are unavoidable, and most people involved in programming deeply understand that they are something we all live with. So there are no hard feelings: once you find a bug, you fix it and immediately move on. Not only does no one think you’re a bad developer because you write bugs, but typically the impact of a bug is not that problematic.
Yes, some bugs are just terrible. And there’s always the risk of losing data or running some catastrophic operation in production. But those are comparatively rare and, with the proper practices, the risk and damage can be reduced. Most day-to-day work involves mistakes with a much more limited effect. Software development is more about being bold and moving fast to fix your mess than about playing it safe (within limits, of course).
Because the greatness of software is that you can break it on purpose, watch it explode, and then fix the problem. In a controlled environment, without having to worry about permanent effects or high costs. And a good test is the one that ambushes the code and tries to viciously stab it with a poisonous dagger. The one that can hurt. So you know that your system is strong enough against that attack. And then you iterate. Quickly. It’s like having a permanent second chance to try again.
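For illustration, a minimal sketch of that kind of ambush test using pytest; the function under attack and the hostile inputs are made up:

```python
import pytest

def parse_age(text):
    """Toy function under attack: parse a human age from a string."""
    age = int(text)
    if not 0 <= age <= 150:
        raise ValueError(f"implausible age: {age}")
    return age

# The "poisonous dagger" cases: inputs chosen specifically to hurt.
@pytest.mark.parametrize("hostile", ["", "  ", "NaN", "-1", "999", "12.5"])
def test_parse_age_survives_hostile_input(hostile):
    with pytest.raises(ValueError):
        parse_age(hostile)
```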
Not every aspect of life is that forgiving. I guess that a lot of doctors would love to be able to do the same.