Long bet: Java, PHP, JavaScript, Python, Ruby, and C# will be the last languages that achieve a critical mass such that they can sustain developers through their whole career. From here on out, it’s a melting pot of framework and technology choices.

@akk I'm curious what makes you think that.

I'm kind of on the fence - there definitely haven't been any new languages in that group for a pretty long time, and I'm not sure I've seen any real breakthrough ideas...

Though throughout our short history, as soon as it looks like language evolution is done, some new popular thing comes out of nowhere...

@scottwerner @akk Are Haskell and OCaml "before" the ones listed above? Or not in the category?

I'm not sure there's really some tipping point in language adoption that makes something in "career bucket" vs not.

@jamescgibson @akk yeah, maybe I'm misreading @akk 's original toot, but the languages he mentions are just massive, with Ruby probably being the smallest in that group by a huge margin.

Career/critical mass to me is like:
- a large % of universities standardize on the language for their core curriculum: Java, C#, JavaScript, Python

- supports a massive industry of freelancers: PHP

- so many open jobs that bootcamps pop up across the country: Ruby, Python, JavaScript

@jamescgibson @akk My question was more around "why is this time different?"

In that original toot, even though the release dates are close, there are essentially two generations of popularity:
1: Java, C#, PHP
2: Python, Ruby, Javascript

Mostly those generational shifts happened because each new generation required much less work to accomplish the same thing (an additional layer of abstraction). Java and C# were that to the generation before them, in the same way that Fortran was to assembly.

@scottwerner @akk I think it's clear "this time is different" basically because of the argument laid out in "No Silver Bullet".

The improvements in total cost of engineering available from languages are, for the time being, tapped out.

The only promising mechanisms for reducing total cost of engineering today are

(1) super-application level, like serverless architectures
(2) more and better use of off the shelf software (again, No Silver Bullet)
(3) maybe a breakthrough in type theory.

@scottwerner @akk

I'd argue that the success of Ruby and Python owes less to their language properties than to their very good standard libraries (Rails included) and their very good, very early package management (less so for Python, but still way better than e.g. C#).

The engineering benefits of these "languages" came from their package systems and strong open-source communities.

@scottwerner @akk

If we take e.g. Hickey's and PG's writings at face value, the Lisps we have already represent the pinnacle of language power and expressiveness; I think there are good reasons to believe this is true.

We've already found that it is incredibly difficult to assemble a good enough team to wield those languages (e.g. Viaweb).

The real benefits of the ruby/python era have come from "off the shelf" composable parts (again, see No Silver Bullet).

@scottwerner @akk

RE: JS, I think the only reason JS caught on as much - and it only did *after* python and ruby - was that everyone had to learn it for web, making it the lowest common denominator, and then after seeing python and ruby's package manager successes, the JS community built copies.

@jamescgibson @akk Right, that's why I'm kind of on the fence, though:
- better package management
- microframeworks (Sinatra)
- REST web frameworks (Rails)
- containerization/12-factor (Heroku)

to name a few, could all have been done in other languages earlier. What was it about the early 2000s and Ruby that caused that explosion?

I'm on the fence since maybe there is something about a language that frees people up to think new thoughts.


@scottwerner @akk That's a good point.

Culturally, Ruby's interest in developer ergonomics definitely plays a role. Steve Klabnik has spoken to this for Rust: when he joined Rust and brought some Rubyists with him, one of the things he thinks carried over was a focus on dev ergonomics. It just wasn't acceptable to e.g. ship cryptic error messages if a more useful error message could be generated.


@scottwerner @akk

The green fields of a new language definitely help. Why would I write Bundler if Ant or Maven already exists, even if Ant and Maven have serious issues? It's a much harder bet to make when there's an alternative, no matter how bad it is.

@jamescgibson You definitely see that now, though I'm not sure how much that's just looking at the situation with hindsight and trying to come up with a guiding principle.

A lot of the Ruby stuff came from _why's ideas, and I don't think he was really thinking explicitly about ergonomics so much as trying to entertain and challenge himself and others - Camping inspiring Sinatra, for example.

@scottwerner That's probably true, but maybe I just see "developer ergonomics" as the corporate speak for "making programmers happy". I think pursuing joy in programming has always been central, from what I've read of Matz.

Maybe "developer ergonomics" is just how you sell joy to your boss.

@jamescgibson ah yeah, MINSWAN and all that. You’re right, I guess it was a guiding principle from day 1. DHH probably points to that idea too.

@jamescgibson but ok, if different starting principles can make ideas that were previously hard to think easy to think, why should we be at a “no silver bullet” stage now? Rather than any of the other times in the past?

@scottwerner The argument from the paper really lays it out: for software, there's fundamental and incidental complexity (Brooks's "essential" and "accidental"). Fundamental complexity cannot be abstracted away.

If you, the programmer, aren't spending >=90% of your time on incidental complexity, then even eliminating *all* incidental complexity will not yield an order-of-magnitude improvement.

I'd venture that if you're already using a modern system, like Rails, <90% of programmer time is spent on incidental complexity.
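The arithmetic behind that 90% threshold can be sketched as an Amdahl's-law-style bound (a minimal illustration; the function name is mine, not from the paper): if a fraction f of programmer time is incidental, eliminating all of it yields at most a 1/(1-f) overall speedup, so a 10x improvement requires f >= 0.9.

```python
# Amdahl-style bound: even eliminating ALL incidental complexity leaves
# the fundamental fraction of the work untouched, capping the speedup.

def max_speedup(incidental_fraction: float) -> float:
    """Best-case overall speedup if all incidental work is eliminated."""
    fundamental = 1.0 - incidental_fraction
    return 1.0 / fundamental

# 90% incidental time caps out at a 10x improvement...
print(round(max_speedup(0.9), 2))   # 10.0
# ...while 40% incidental (i.e. 60% fundamental) caps out below 2x.
print(round(max_speedup(0.4), 2))   # 1.67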

@scottwerner Now, to connect this to conversations we've had in the past - I think there *is* still an argument for redefining what counts as incidental complexity: preventing users from requesting that the gas and brake pedals be a different distance apart.

But that's changing the user's/customer's expectations, not building the same system faster - which is still a valid solution, just outside the scope of "No Silver Bullet".

@jamescgibson I guess I'll have to read the paper, but from the wikipedia summary:

"there is no single development, in either technology or management technique, which by itself promises even one order of magnitude [tenfold] improvement within a decade in productivity, in reliability, in simplicity."

It was written in 1986, and I feel like we've seen it proven wrong two decades in a row since, with those two generations from the original toot.

@scottwerner The paper is definitely worth a read. I'd venture that we have experienced one 10x improvement, but I'm not sure of two.

It of course depends on how "speed" is measured here as well.

@scottwerner Apparently he wrote some reflections in the newer editions of The Mythical Man-Month, so I guess I'll have to go track down a copy.

@scottwerner Actually, arguably, he was right: even if you count Java and C# as successes, Java 1.0 wasn't released until 1996, 10 years after the paper was published - C# not until 2000.

The C2 Wiki discussion is good - at least one comment does list Java and DotNet as silver bullets.


And he does, in the paper, specifically mention OO and high-level languages (mentioning Ada) as hopes for silver bullets.

@jamescgibson Ok, but if we're going to count Java as a success there, are we now debating whether the silver bullet arrives after 12 or 13 years instead of the original 10?

@scottwerner Yeah haha, sorry I was just being cheeky.

I'd agree between the two generations you mentioned we definitely have a 10x speedup. Not sure about 100x.

And, to be fair, I've always thought 10x was silly; a 2x improvement would be amazing - and I definitely agree the two generations you mention each brought 2x, as did cloud.

@scottwerner Still, today, looking at my work: I probably spend 60% of my time dealing with fundamental complexity, even if that fundamental complexity is only fundamental because of customer requirements. So I'm not hopeful for even another 2x, in the field I'm in now.

For greenfield software, I think there are good opportunities for 2x speedups. Still skeptical about 10x.

@scottwerner To reconnect to the original toot - I think any practice that can deliver a 2x speedup has a good chance of becoming "mainstream" (for lack of a better term) and widespread.

I'm not sure any new language can do that for general problems.

@jamescgibson :) yeah, though that's the question I have - why would this time be different?

Have we really gotten rid of all the incidental complexity? I'm not convinced we have.

I'd be curious to see what you're working on and what makes it different from things I've worked on in the past.

@scottwerner School SIS integrations - so a lot of what were incidental decisions for the original author of the software we're integrating with becomes fundamental complexity for us.

Really, though, probably 50% of our team's time is spent on external communications and requirements gathering - no language is going to solve that.

Refactor Camp