In 2009 I started the Alan Turing petition. Perhaps you read about it in WIRED back in 2014. This is my first-hand account of how I started the petition, automated my father with Perl scripts, convinced the UK to apologise to Alan Turing, and received a personal phone call from the British Prime Minister.
Need a history lesson on CSS and web design? Take a long journey with Eevee on the subject…
I first got into web design/development in the late 90s, and only as I type this sentence do I realize how long ago that was. And boy, it was horrendous. I mean, being able to make stuff and put it online where other people could see it was pretty slick, but we did not have very much to work with.
I’ve been taking for granted that most folks doing web stuff still remember those days, or at least the decade that followed, but I think that assumption might be a wee bit out of date. Some time ago I encountered a tweet marvelling at what we had to do without border-radius. I still remember waiting with bated breath for it to be unprefixed!
This excellent post is a mix of history and possible futures:
As the WWW spread, it grew features. Soon, it was not enough for the documents to contain just text: support for images was added. People wanted to customize the look of the documents, so HTML gained presentational markup abilities, eventually obsoleted by CSS. It was not enough to be able to view the menu of your local pizza store – people wanted to actually order a pizza: the need for sessions yielded cookies and non-idempotent HTTP methods. And people wanted the pages to be interactive, so they became scriptable.
All these features were good. They helped the Web meet actual needs. But having them has a significant consequence, one that is seldom realized:
We don’t have a Web of Documents anymore.
Daniel goes on to argue that what we have today is a Web of Applications, but he believes we can recreate the old web by adding just three restraints…
While many of us were writing our year-end wrap-ups, Matt Asay sauntered into the room, kindly requested that we “hold his beer”, and proceeded to write his decade-end wrap-up.
We’re about to conclude another decade of open source, and what a long, strange trip it has been. Reading back through predictions made in 2009, no one had the foggiest clue that GitHub would change software development forever (and for everyone), or that Microsoft would go from open source pariah to the world’s largest contributor, or a host of other dramatic changes that became the new normal during a decade that was anything but normal.
We are all open sourcerors now as we round out the decade. Let’s look back at some of the most significant open source innovations that got us here.
Cory Doctorow goes deep into Usenet’s history and uncovers a sage decision by the “backbone cabal” which may help us improve the web’s (currently centralized) state:
Restoring adversarial interoperability will allow future companies, co-operatives and tinkerers to go beyond the comfort zones of the winners of the previous rounds of the game – so that it ceases to be a winner-take-all affair, and instead becomes the kind of dynamic place where a backbone cabal can have total control one year, and be sidelined the next.
As the saying goes… history doesn’t repeat itself, but it often rhymes.
If you want to create a successful programming language (or at least understand how you might), it’s immensely valuable to learn from others who have done just that. On Go Time episode #100, two of Go’s creators (Rob Pike and Robert Griesemer) sat down to discuss the language’s success. Here are 5 things they attribute it to.
Nice piece by Slate:
To shed light on the software that has tilted the world on its axis, the editors polled computer scientists, software developers, historians, policymakers, and journalists. They were asked to pick: Which pieces of code had a huge influence? Which ones warped our lives? About 75 responded with all sorts of ideas, and Slate has selected 36.
A nice, quick read about resourceful engineering teams turning lemons into lemonade. It might be a stretch to call all of these bugs, but it’s worth stretching a little bit in this case. Did you know Gmail’s “unsend” feature never would’ve happened were it not for a performance bug?!
Version Museum showcases the visual history of popular websites, operating systems, applications, and games that have shaped our lives.
I freakin’ love this site. They have quite a collection here, everything from Amazon.com and Google Maps to Mac OS and Super Mario Kart. Version 1.5 of Microsoft Excel was dope! (full Excel history here)
Many people and companies have poorly interpreted Grace Hopper’s famous quote about getting things done inside bureaucracies. I’m here to set the record straight.
Original Apollo 11 guidance computer (AGC) source code for Command Module (Comanche055) and Lunar Module (Luminary099). Digitized by the folks at Virtual AGC and MIT Museum. The goal is to be a repo for the original Apollo 11 source code. As such, PRs are welcome for any issues identified between the transcriptions in this repository and the original source scans for Luminary 099 and Comanche 055, as well as any files I may have missed.
A nice bit of history to peruse in honor of the flight’s recent 50th anniversary. 100% Assembly tho 😱
It’s great to read RMS and other GNU developers’ perspectives on how we got past the UNIX days. I’m particularly interested in a conversation around this statement from the author:
Open source discourse typically encourages certain practices for the sake of practical advantages, not as a moral imperative.
I’m fascinated by the different perspectives. There’s one where F/OSS is a human right, and another where it’s a business opportunity. They’re not mutually exclusive, but which is more prevalent these days?
My thought is that we wouldn’t be where we are today if the former didn’t dominate in the ‘90s, but we’re significantly more capitalistic with our OSS these days.
What’s your take on it?
E.W. Dijkstra, in an ACM lecture he delivered almost 50 years ago:
… the computer, by virtue of its fantastic speed, seems to be the first to provide us with an environment where highly hierarchical artifacts are both possible and necessary. This challenge, viz. the confrontation with the programming task, is so unique that this novel experience can teach us a lot about ourselves. It should deepen our understanding of the processes of design and creation, it should give us better control over the task of organizing our thoughts. If it did not do so, to my taste we should not deserve the computer at all!
Starring Lee Byron, Dan Schafer and Nick Schrock (co-creators of GraphQL) and other big names from the community [the documentary] explores the story of why and how GraphQL came to be and the impact it’s having on big tech companies worldwide.
It’s hard to believe it’s already been 9 years since Rust was first announced to the world. The New Stack has a nice interview with Graydon Hoare…
sharing his thoughts on everything from the state of systems programming, to the difficulty of defining safety on ever-more complex systems — and whether we’re truly more secure today, or confronting an inherited software mess that will take decades to clean up.
In the early days of computing, hardware was expensive and programmers were cheap.
I thoroughly enjoy Erik’s look back at the history that brought us here and the technology we rely on every day, whether it’s visible or not.
Following the sad news of Joe Armstrong’s passing, some of his former colleagues from Ericsson wrote a good-bye note and asked if InfoQ would publish it.
Joe had been on my shortlist of people to invite on The Changelog for a long time, but I never got around to contacting him. Regretful. This is a touching tribute. I especially enjoyed this bit:
Nobody could avoid being affected by Joe’s good mood and boundless enthusiasm. He was highly appreciated as a speaker and panel member at many international conferences. Many programmers can testify to just how important Joe has been for them in developing their profession.
On Practical AI #41, Adam Berenzweig gave a sweeping history of human-computer interaction (HCI) and a glimpse into what the future might hold.